Feb 02 09:00:39 localhost kernel: Linux version 5.14.0-665.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026
Feb 02 09:00:39 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 02 09:00:39 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 02 09:00:39 localhost kernel: BIOS-provided physical RAM map:
Feb 02 09:00:39 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 02 09:00:39 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 02 09:00:39 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 02 09:00:39 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 02 09:00:39 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 02 09:00:39 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 02 09:00:39 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 02 09:00:39 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb 02 09:00:39 localhost kernel: NX (Execute Disable) protection: active
Feb 02 09:00:39 localhost kernel: APIC: Static calls initialized
Feb 02 09:00:39 localhost kernel: SMBIOS 2.8 present.
Feb 02 09:00:39 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 02 09:00:39 localhost kernel: Hypervisor detected: KVM
Feb 02 09:00:39 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 02 09:00:39 localhost kernel: kvm-clock: using sched offset of 5598956451 cycles
Feb 02 09:00:39 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 02 09:00:39 localhost kernel: tsc: Detected 2800.000 MHz processor
Feb 02 09:00:39 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 02 09:00:39 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 02 09:00:39 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb 02 09:00:39 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 02 09:00:39 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 02 09:00:39 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 02 09:00:39 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 02 09:00:39 localhost kernel: Using GB pages for direct mapping
Feb 02 09:00:39 localhost kernel: RAMDISK: [mem 0x2d410000-0x329fffff]
Feb 02 09:00:39 localhost kernel: ACPI: Early table checksum verification disabled
Feb 02 09:00:39 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 02 09:00:39 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 02 09:00:39 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 02 09:00:39 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 02 09:00:39 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 02 09:00:39 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 02 09:00:39 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 02 09:00:39 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 02 09:00:39 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 02 09:00:39 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 02 09:00:39 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 02 09:00:39 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 02 09:00:39 localhost kernel: No NUMA configuration found
Feb 02 09:00:39 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb 02 09:00:39 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Feb 02 09:00:39 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Feb 02 09:00:39 localhost kernel: Zone ranges:
Feb 02 09:00:39 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 02 09:00:39 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 02 09:00:39 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb 02 09:00:39 localhost kernel:   Device   empty
Feb 02 09:00:39 localhost kernel: Movable zone start for each node
Feb 02 09:00:39 localhost kernel: Early memory node ranges
Feb 02 09:00:39 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 02 09:00:39 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 02 09:00:39 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb 02 09:00:39 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb 02 09:00:39 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 02 09:00:39 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 02 09:00:39 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 02 09:00:39 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 02 09:00:39 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 02 09:00:39 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 02 09:00:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 02 09:00:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 02 09:00:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 02 09:00:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 02 09:00:39 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 02 09:00:39 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 02 09:00:39 localhost kernel: TSC deadline timer available
Feb 02 09:00:39 localhost kernel: CPU topo: Max. logical packages:   8
Feb 02 09:00:39 localhost kernel: CPU topo: Max. logical dies:       8
Feb 02 09:00:39 localhost kernel: CPU topo: Max. dies per package:   1
Feb 02 09:00:39 localhost kernel: CPU topo: Max. threads per core:   1
Feb 02 09:00:39 localhost kernel: CPU topo: Num. cores per package:     1
Feb 02 09:00:39 localhost kernel: CPU topo: Num. threads per package:   1
Feb 02 09:00:39 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb 02 09:00:39 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 02 09:00:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 02 09:00:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 02 09:00:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 02 09:00:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 02 09:00:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 02 09:00:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 02 09:00:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 02 09:00:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 02 09:00:39 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 02 09:00:39 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 02 09:00:39 localhost kernel: Booting paravirtualized kernel on KVM
Feb 02 09:00:39 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 02 09:00:39 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 02 09:00:39 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb 02 09:00:39 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Feb 02 09:00:39 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 02 09:00:39 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 02 09:00:39 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 02 09:00:39 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64", will be passed to user space.
Feb 02 09:00:39 localhost kernel: random: crng init done
Feb 02 09:00:39 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 02 09:00:39 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb 02 09:00:39 localhost kernel: Fallback order for Node 0: 0 
Feb 02 09:00:39 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb 02 09:00:39 localhost kernel: Policy zone: Normal
Feb 02 09:00:39 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 02 09:00:39 localhost kernel: software IO TLB: area num 8.
Feb 02 09:00:39 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 02 09:00:39 localhost kernel: ftrace: allocating 49438 entries in 194 pages
Feb 02 09:00:39 localhost kernel: ftrace: allocated 194 pages with 3 groups
Feb 02 09:00:39 localhost kernel: Dynamic Preempt: voluntary
Feb 02 09:00:39 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 02 09:00:39 localhost kernel: rcu:         RCU event tracing is enabled.
Feb 02 09:00:39 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 02 09:00:39 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 02 09:00:39 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 02 09:00:39 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 02 09:00:39 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 02 09:00:39 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 02 09:00:39 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 02 09:00:39 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 02 09:00:39 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 02 09:00:39 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 02 09:00:39 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 02 09:00:39 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 02 09:00:39 localhost kernel: Console: colour VGA+ 80x25
Feb 02 09:00:39 localhost kernel: printk: console [ttyS0] enabled
Feb 02 09:00:39 localhost kernel: ACPI: Core revision 20230331
Feb 02 09:00:39 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 02 09:00:39 localhost kernel: x2apic enabled
Feb 02 09:00:39 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Feb 02 09:00:39 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 02 09:00:39 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Feb 02 09:00:39 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 02 09:00:39 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 02 09:00:39 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 02 09:00:39 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb 02 09:00:39 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 02 09:00:39 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 02 09:00:39 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 02 09:00:39 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb 02 09:00:39 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 02 09:00:39 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb 02 09:00:39 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 02 09:00:39 localhost kernel: active return thunk: retbleed_return_thunk
Feb 02 09:00:39 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 02 09:00:39 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 02 09:00:39 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 02 09:00:39 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 02 09:00:39 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 02 09:00:39 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 02 09:00:39 localhost kernel: Freeing SMP alternatives memory: 40K
Feb 02 09:00:39 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 02 09:00:39 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb 02 09:00:39 localhost kernel: landlock: Up and running.
Feb 02 09:00:39 localhost kernel: Yama: becoming mindful.
Feb 02 09:00:39 localhost kernel: SELinux:  Initializing.
Feb 02 09:00:39 localhost kernel: LSM support for eBPF active
Feb 02 09:00:39 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 02 09:00:39 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 02 09:00:39 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 02 09:00:39 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 02 09:00:39 localhost kernel: ... version:                0
Feb 02 09:00:39 localhost kernel: ... bit width:              48
Feb 02 09:00:39 localhost kernel: ... generic registers:      6
Feb 02 09:00:39 localhost kernel: ... value mask:             0000ffffffffffff
Feb 02 09:00:39 localhost kernel: ... max period:             00007fffffffffff
Feb 02 09:00:39 localhost kernel: ... fixed-purpose events:   0
Feb 02 09:00:39 localhost kernel: ... event mask:             000000000000003f
Feb 02 09:00:39 localhost kernel: signal: max sigframe size: 1776
Feb 02 09:00:39 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 02 09:00:39 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 02 09:00:39 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 02 09:00:39 localhost kernel: smpboot: x86: Booting SMP configuration:
Feb 02 09:00:39 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 02 09:00:39 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 02 09:00:39 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Feb 02 09:00:39 localhost kernel: node 0 deferred pages initialised in 10ms
Feb 02 09:00:39 localhost kernel: Memory: 7763692K/8388068K available (16384K kernel code, 5801K rwdata, 13928K rodata, 4196K init, 7192K bss, 618408K reserved, 0K cma-reserved)
Feb 02 09:00:39 localhost kernel: devtmpfs: initialized
Feb 02 09:00:39 localhost kernel: x86/mm: Memory block size: 128MB
Feb 02 09:00:39 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 02 09:00:39 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb 02 09:00:39 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 02 09:00:39 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 02 09:00:39 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb 02 09:00:39 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 02 09:00:39 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 02 09:00:39 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 02 09:00:39 localhost kernel: audit: type=2000 audit(1770022838.682:1): state=initialized audit_enabled=0 res=1
Feb 02 09:00:39 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 02 09:00:39 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 02 09:00:39 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 02 09:00:39 localhost kernel: cpuidle: using governor menu
Feb 02 09:00:39 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 02 09:00:39 localhost kernel: PCI: Using configuration type 1 for base access
Feb 02 09:00:39 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 02 09:00:39 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 02 09:00:39 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 02 09:00:39 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 02 09:00:39 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 02 09:00:39 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 02 09:00:39 localhost kernel: Demotion targets for Node 0: null
Feb 02 09:00:39 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 02 09:00:39 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 02 09:00:39 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 02 09:00:39 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 02 09:00:39 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 02 09:00:39 localhost kernel: ACPI: Interpreter enabled
Feb 02 09:00:39 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 02 09:00:39 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 02 09:00:39 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 02 09:00:39 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 02 09:00:39 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 02 09:00:39 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 02 09:00:39 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [3] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [4] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [5] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [6] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [7] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [8] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [9] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [10] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [11] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [12] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [13] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [14] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [15] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [16] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [17] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [18] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [19] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [20] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [21] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [22] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [23] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [24] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [25] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [26] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [27] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [28] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [29] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [30] registered
Feb 02 09:00:39 localhost kernel: acpiphp: Slot [31] registered
Feb 02 09:00:39 localhost kernel: PCI host bridge to bus 0000:00
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 02 09:00:39 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 02 09:00:39 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb 02 09:00:39 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 02 09:00:39 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb 02 09:00:39 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb 02 09:00:39 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 02 09:00:39 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb 02 09:00:39 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb 02 09:00:39 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 02 09:00:39 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb 02 09:00:39 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb 02 09:00:39 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb 02 09:00:39 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 02 09:00:39 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb 02 09:00:39 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 02 09:00:39 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb 02 09:00:39 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb 02 09:00:39 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 02 09:00:39 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 02 09:00:39 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 02 09:00:39 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 02 09:00:39 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 02 09:00:39 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 02 09:00:39 localhost kernel: iommu: Default domain type: Translated
Feb 02 09:00:39 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 02 09:00:39 localhost kernel: SCSI subsystem initialized
Feb 02 09:00:39 localhost kernel: ACPI: bus type USB registered
Feb 02 09:00:39 localhost kernel: usbcore: registered new interface driver usbfs
Feb 02 09:00:39 localhost kernel: usbcore: registered new interface driver hub
Feb 02 09:00:39 localhost kernel: usbcore: registered new device driver usb
Feb 02 09:00:39 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 02 09:00:39 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 02 09:00:39 localhost kernel: PTP clock support registered
Feb 02 09:00:39 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 02 09:00:39 localhost kernel: NetLabel: Initializing
Feb 02 09:00:39 localhost kernel: NetLabel:  domain hash size = 128
Feb 02 09:00:39 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 02 09:00:39 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 02 09:00:39 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 02 09:00:39 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 02 09:00:39 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 02 09:00:39 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 02 09:00:39 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 02 09:00:39 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 02 09:00:39 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 02 09:00:39 localhost kernel: vgaarb: loaded
Feb 02 09:00:39 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 02 09:00:39 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 02 09:00:39 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 02 09:00:39 localhost kernel: pnp: PnP ACPI init
Feb 02 09:00:39 localhost kernel: pnp 00:03: [dma 2]
Feb 02 09:00:39 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 02 09:00:39 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 02 09:00:39 localhost kernel: NET: Registered PF_INET protocol family
Feb 02 09:00:39 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 02 09:00:39 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb 02 09:00:39 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 02 09:00:39 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 02 09:00:39 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 02 09:00:39 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb 02 09:00:39 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb 02 09:00:39 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 02 09:00:39 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 02 09:00:39 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 02 09:00:39 localhost kernel: NET: Registered PF_XDP protocol family
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 02 09:00:39 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 02 09:00:39 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 02 09:00:39 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 02 09:00:39 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 39780 usecs
Feb 02 09:00:39 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 02 09:00:39 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 02 09:00:39 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 02 09:00:39 localhost kernel: ACPI: bus type thunderbolt registered
Feb 02 09:00:39 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 02 09:00:39 localhost kernel: Initialise system trusted keyrings
Feb 02 09:00:39 localhost kernel: Key type blacklist registered
Feb 02 09:00:39 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb 02 09:00:39 localhost kernel: zbud: loaded
Feb 02 09:00:39 localhost kernel: integrity: Platform Keyring initialized
Feb 02 09:00:39 localhost kernel: integrity: Machine keyring initialized
Feb 02 09:00:39 localhost kernel: Freeing initrd memory: 88000K
Feb 02 09:00:39 localhost kernel: NET: Registered PF_ALG protocol family
Feb 02 09:00:39 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 02 09:00:39 localhost kernel: Key type asymmetric registered
Feb 02 09:00:39 localhost kernel: Asymmetric key parser 'x509' registered
Feb 02 09:00:39 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 02 09:00:39 localhost kernel: io scheduler mq-deadline registered
Feb 02 09:00:39 localhost kernel: io scheduler kyber registered
Feb 02 09:00:39 localhost kernel: io scheduler bfq registered
Feb 02 09:00:39 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 02 09:00:39 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 02 09:00:39 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 02 09:00:39 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 02 09:00:39 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 02 09:00:39 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 02 09:00:39 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 02 09:00:39 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 02 09:00:39 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 02 09:00:39 localhost kernel: Non-volatile memory driver v1.3
Feb 02 09:00:39 localhost kernel: rdac: device handler registered
Feb 02 09:00:39 localhost kernel: hp_sw: device handler registered
Feb 02 09:00:39 localhost kernel: emc: device handler registered
Feb 02 09:00:39 localhost kernel: alua: device handler registered
Feb 02 09:00:39 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 02 09:00:39 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 02 09:00:39 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 02 09:00:39 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 02 09:00:39 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 02 09:00:39 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 02 09:00:39 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 02 09:00:39 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-665.el9.x86_64 uhci_hcd
Feb 02 09:00:39 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 02 09:00:39 localhost kernel: hub 1-0:1.0: USB hub found
Feb 02 09:00:39 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 02 09:00:39 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 02 09:00:39 localhost kernel: usbserial: USB Serial support registered for generic
Feb 02 09:00:39 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 02 09:00:39 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 02 09:00:39 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 02 09:00:39 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 02 09:00:39 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 02 09:00:39 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 02 09:00:39 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 02 09:00:39 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-02T09:00:38 UTC (1770022838)
Feb 02 09:00:39 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 02 09:00:39 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb 02 09:00:39 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 02 09:00:39 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 02 09:00:39 localhost kernel: usbcore: registered new interface driver usbhid
Feb 02 09:00:39 localhost kernel: usbhid: USB HID core driver
Feb 02 09:00:39 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 02 09:00:39 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 02 09:00:39 localhost kernel: Initializing XFRM netlink socket
Feb 02 09:00:39 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 02 09:00:39 localhost kernel: Segment Routing with IPv6
Feb 02 09:00:39 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 02 09:00:39 localhost kernel: mpls_gso: MPLS GSO support
Feb 02 09:00:39 localhost kernel: IPI shorthand broadcast: enabled
Feb 02 09:00:39 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 02 09:00:39 localhost kernel: AES CTR mode by8 optimization enabled
Feb 02 09:00:39 localhost kernel: sched_clock: Marking stable (899001730, 144895020)->(1138477630, -94580880)
Feb 02 09:00:39 localhost kernel: registered taskstats version 1
Feb 02 09:00:39 localhost kernel: Loading compiled-in X.509 certificates
Feb 02 09:00:39 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Feb 02 09:00:39 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 02 09:00:39 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 02 09:00:39 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb 02 09:00:39 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb 02 09:00:39 localhost kernel: Demotion targets for Node 0: null
Feb 02 09:00:39 localhost kernel: page_owner is disabled
Feb 02 09:00:39 localhost kernel: Key type .fscrypt registered
Feb 02 09:00:39 localhost kernel: Key type fscrypt-provisioning registered
Feb 02 09:00:39 localhost kernel: Key type big_key registered
Feb 02 09:00:39 localhost kernel: Key type encrypted registered
Feb 02 09:00:39 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 02 09:00:39 localhost kernel: Loading compiled-in module X.509 certificates
Feb 02 09:00:39 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Feb 02 09:00:39 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 02 09:00:39 localhost kernel: ima: No architecture policies found
Feb 02 09:00:39 localhost kernel: evm: Initialising EVM extended attributes:
Feb 02 09:00:39 localhost kernel: evm: security.selinux
Feb 02 09:00:39 localhost kernel: evm: security.SMACK64 (disabled)
Feb 02 09:00:39 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 02 09:00:39 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 02 09:00:39 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 02 09:00:39 localhost kernel: evm: security.apparmor (disabled)
Feb 02 09:00:39 localhost kernel: evm: security.ima
Feb 02 09:00:39 localhost kernel: evm: security.capability
Feb 02 09:00:39 localhost kernel: evm: HMAC attrs: 0x1
Feb 02 09:00:39 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 02 09:00:39 localhost kernel: Running certificate verification RSA selftest
Feb 02 09:00:39 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 02 09:00:39 localhost kernel: Running certificate verification ECDSA selftest
Feb 02 09:00:39 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb 02 09:00:39 localhost kernel: clk: Disabling unused clocks
Feb 02 09:00:39 localhost kernel: Freeing unused decrypted memory: 2028K
Feb 02 09:00:39 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Feb 02 09:00:39 localhost kernel: Write protecting the kernel read-only data: 30720k
Feb 02 09:00:39 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 408K
Feb 02 09:00:39 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 02 09:00:39 localhost kernel: Run /init as init process
Feb 02 09:00:39 localhost kernel:   with arguments:
Feb 02 09:00:39 localhost kernel:     /init
Feb 02 09:00:39 localhost kernel:   with environment:
Feb 02 09:00:39 localhost kernel:     HOME=/
Feb 02 09:00:39 localhost kernel:     TERM=linux
Feb 02 09:00:39 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64
Feb 02 09:00:39 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 02 09:00:39 localhost systemd[1]: Detected virtualization kvm.
Feb 02 09:00:39 localhost systemd[1]: Detected architecture x86-64.
Feb 02 09:00:39 localhost systemd[1]: Running in initrd.
Feb 02 09:00:39 localhost systemd[1]: No hostname configured, using default hostname.
Feb 02 09:00:39 localhost systemd[1]: Hostname set to <localhost>.
Feb 02 09:00:39 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 02 09:00:39 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 02 09:00:39 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 02 09:00:39 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 02 09:00:39 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 02 09:00:39 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 02 09:00:39 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 02 09:00:39 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 02 09:00:39 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 02 09:00:39 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 02 09:00:39 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 02 09:00:39 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 02 09:00:39 localhost systemd[1]: Reached target Local File Systems.
Feb 02 09:00:39 localhost systemd[1]: Reached target Path Units.
Feb 02 09:00:39 localhost systemd[1]: Reached target Slice Units.
Feb 02 09:00:39 localhost systemd[1]: Reached target Swaps.
Feb 02 09:00:39 localhost systemd[1]: Reached target Timer Units.
Feb 02 09:00:39 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 02 09:00:39 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 02 09:00:39 localhost systemd[1]: Listening on Journal Socket.
Feb 02 09:00:39 localhost systemd[1]: Listening on udev Control Socket.
Feb 02 09:00:39 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 02 09:00:39 localhost systemd[1]: Reached target Socket Units.
Feb 02 09:00:39 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 02 09:00:39 localhost systemd[1]: Starting Journal Service...
Feb 02 09:00:39 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 02 09:00:39 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 02 09:00:39 localhost systemd[1]: Starting Create System Users...
Feb 02 09:00:39 localhost systemd[1]: Starting Setup Virtual Console...
Feb 02 09:00:39 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 02 09:00:39 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 02 09:00:39 localhost systemd[1]: Finished Create System Users.
Feb 02 09:00:39 localhost systemd-journald[306]: Journal started
Feb 02 09:00:39 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/7f778d97f318438087762e4d99e5fd86) is 8.0M, max 153.6M, 145.6M free.
Feb 02 09:00:39 localhost systemd-sysusers[310]: Creating group 'users' with GID 100.
Feb 02 09:00:39 localhost systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Feb 02 09:00:39 localhost systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 02 09:00:39 localhost systemd[1]: Started Journal Service.
Feb 02 09:00:39 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 02 09:00:39 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 02 09:00:39 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 02 09:00:39 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 02 09:00:39 localhost systemd[1]: Finished Setup Virtual Console.
Feb 02 09:00:39 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 02 09:00:39 localhost systemd[1]: Starting dracut cmdline hook...
Feb 02 09:00:39 localhost dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Feb 02 09:00:39 localhost dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 02 09:00:39 localhost systemd[1]: Finished dracut cmdline hook.
Feb 02 09:00:39 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 02 09:00:39 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 02 09:00:39 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 02 09:00:39 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb 02 09:00:39 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 02 09:00:39 localhost kernel: RPC: Registered udp transport module.
Feb 02 09:00:39 localhost kernel: RPC: Registered tcp transport module.
Feb 02 09:00:39 localhost kernel: RPC: Registered tcp-with-tls transport module.
Feb 02 09:00:39 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 02 09:00:39 localhost rpc.statd[442]: Version 2.5.4 starting
Feb 02 09:00:39 localhost rpc.statd[442]: Initializing NSM state
Feb 02 09:00:39 localhost rpc.idmapd[447]: Setting log level to 0
Feb 02 09:00:39 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 02 09:00:39 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 02 09:00:39 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Feb 02 09:00:39 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 02 09:00:39 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 02 09:00:39 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 02 09:00:39 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 02 09:00:39 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 02 09:00:39 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 02 09:00:39 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 02 09:00:39 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 02 09:00:39 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 02 09:00:39 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 02 09:00:39 localhost systemd[1]: Reached target Network.
Feb 02 09:00:39 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 02 09:00:39 localhost systemd[1]: Starting dracut initqueue hook...
Feb 02 09:00:39 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb 02 09:00:39 localhost kernel: libata version 3.00 loaded.
Feb 02 09:00:39 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 02 09:00:39 localhost kernel: scsi host0: ata_piix
Feb 02 09:00:39 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb 02 09:00:39 localhost kernel: scsi host1: ata_piix
Feb 02 09:00:39 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb 02 09:00:39 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb 02 09:00:39 localhost systemd-udevd[496]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 09:00:39 localhost kernel:  vda: vda1
Feb 02 09:00:40 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 02 09:00:40 localhost kernel: ata1: found unknown device (class 0)
Feb 02 09:00:40 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 02 09:00:40 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 02 09:00:40 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 02 09:00:40 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 02 09:00:40 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 02 09:00:40 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 02 09:00:40 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 02 09:00:40 localhost systemd[1]: Found device /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Feb 02 09:00:40 localhost systemd[1]: Reached target Initrd Root Device.
Feb 02 09:00:40 localhost systemd[1]: Reached target System Initialization.
Feb 02 09:00:40 localhost systemd[1]: Reached target Basic System.
Feb 02 09:00:40 localhost systemd[1]: Finished dracut initqueue hook.
Feb 02 09:00:40 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 02 09:00:40 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 02 09:00:40 localhost systemd[1]: Reached target Remote File Systems.
Feb 02 09:00:40 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 02 09:00:40 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 02 09:00:40 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8...
Feb 02 09:00:40 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Feb 02 09:00:40 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Feb 02 09:00:40 localhost systemd[1]: Mounting /sysroot...
Feb 02 09:00:40 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 02 09:00:40 localhost kernel: XFS (vda1): Mounting V5 Filesystem 822f14ea-6e7e-41df-b0d8-fbe282d9ded8
Feb 02 09:00:40 localhost kernel: XFS (vda1): Ending clean mount
Feb 02 09:00:40 localhost systemd[1]: Mounted /sysroot.
Feb 02 09:00:40 localhost systemd[1]: Reached target Initrd Root File System.
Feb 02 09:00:40 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 02 09:00:40 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 02 09:00:40 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 02 09:00:40 localhost systemd[1]: Reached target Initrd File Systems.
Feb 02 09:00:40 localhost systemd[1]: Reached target Initrd Default Target.
Feb 02 09:00:40 localhost systemd[1]: Starting dracut mount hook...
Feb 02 09:00:41 localhost systemd[1]: Finished dracut mount hook.
Feb 02 09:00:41 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 02 09:00:41 localhost rpc.idmapd[447]: exiting on signal 15
Feb 02 09:00:41 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 02 09:00:41 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 02 09:00:41 localhost systemd[1]: Stopped target Network.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Timer Units.
Feb 02 09:00:41 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 02 09:00:41 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Basic System.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Path Units.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Remote File Systems.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Slice Units.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Socket Units.
Feb 02 09:00:41 localhost systemd[1]: Stopped target System Initialization.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Local File Systems.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Swaps.
Feb 02 09:00:41 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped dracut mount hook.
Feb 02 09:00:41 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 02 09:00:41 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 02 09:00:41 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 02 09:00:41 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 02 09:00:41 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 02 09:00:41 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 02 09:00:41 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 02 09:00:41 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 02 09:00:41 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 02 09:00:41 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 02 09:00:41 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 02 09:00:41 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Closed udev Control Socket.
Feb 02 09:00:41 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Closed udev Kernel Socket.
Feb 02 09:00:41 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 02 09:00:41 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 02 09:00:41 localhost systemd[1]: Starting Cleanup udev Database...
Feb 02 09:00:41 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 02 09:00:41 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 02 09:00:41 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Stopped Create System Users.
Feb 02 09:00:41 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 02 09:00:41 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 02 09:00:41 localhost systemd[1]: Finished Cleanup udev Database.
Feb 02 09:00:41 localhost systemd[1]: Reached target Switch Root.
Feb 02 09:00:41 localhost systemd[1]: Starting Switch Root...
Feb 02 09:00:41 localhost systemd[1]: Switching root.
Feb 02 09:00:41 localhost systemd-journald[306]: Journal stopped
Feb 02 09:00:42 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Feb 02 09:00:42 localhost kernel: audit: type=1404 audit(1770022841.496:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 02 09:00:42 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 02 09:00:42 localhost kernel: SELinux:  policy capability open_perms=1
Feb 02 09:00:42 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 02 09:00:42 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 02 09:00:42 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 02 09:00:42 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 02 09:00:42 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 02 09:00:42 localhost kernel: audit: type=1403 audit(1770022841.616:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 02 09:00:42 localhost systemd[1]: Successfully loaded SELinux policy in 123.354ms.
Feb 02 09:00:42 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 25.796ms.
Feb 02 09:00:42 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 02 09:00:42 localhost systemd[1]: Detected virtualization kvm.
Feb 02 09:00:42 localhost systemd[1]: Detected architecture x86-64.
Feb 02 09:00:42 localhost systemd-rc-local-generator[639]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:00:42 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 02 09:00:42 localhost systemd[1]: Stopped Switch Root.
Feb 02 09:00:42 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 02 09:00:42 localhost systemd[1]: Created slice Slice /system/getty.
Feb 02 09:00:42 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 02 09:00:42 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 02 09:00:42 localhost systemd[1]: Created slice User and Session Slice.
Feb 02 09:00:42 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 02 09:00:42 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 02 09:00:42 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 02 09:00:42 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 02 09:00:42 localhost systemd[1]: Stopped target Switch Root.
Feb 02 09:00:42 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 02 09:00:42 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 02 09:00:42 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 02 09:00:42 localhost systemd[1]: Reached target Path Units.
Feb 02 09:00:42 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 02 09:00:42 localhost systemd[1]: Reached target Slice Units.
Feb 02 09:00:42 localhost systemd[1]: Reached target Swaps.
Feb 02 09:00:42 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 02 09:00:42 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 02 09:00:42 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 02 09:00:42 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 02 09:00:42 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 02 09:00:42 localhost systemd[1]: Listening on udev Control Socket.
Feb 02 09:00:42 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 02 09:00:42 localhost systemd[1]: Mounting Huge Pages File System...
Feb 02 09:00:42 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 02 09:00:42 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 02 09:00:42 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 02 09:00:42 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 02 09:00:42 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 02 09:00:42 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 02 09:00:42 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 02 09:00:42 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Feb 02 09:00:42 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 02 09:00:42 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 02 09:00:42 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 02 09:00:42 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 02 09:00:42 localhost systemd[1]: Stopped Journal Service.
Feb 02 09:00:42 localhost systemd[1]: Starting Journal Service...
Feb 02 09:00:42 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 02 09:00:42 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 02 09:00:42 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 02 09:00:42 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 02 09:00:42 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 02 09:00:42 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 02 09:00:42 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 02 09:00:42 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 02 09:00:42 localhost kernel: fuse: init (API version 7.37)
Feb 02 09:00:42 localhost systemd[1]: Mounted Huge Pages File System.
Feb 02 09:00:42 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 02 09:00:42 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 02 09:00:42 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 02 09:00:42 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 02 09:00:42 localhost systemd-journald[680]: Journal started
Feb 02 09:00:42 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Feb 02 09:00:42 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 02 09:00:42 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 02 09:00:42 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 02 09:00:42 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 02 09:00:42 localhost systemd[1]: Started Journal Service.
Feb 02 09:00:42 localhost kernel: ACPI: bus type drm_connector registered
Feb 02 09:00:42 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 02 09:00:42 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 02 09:00:42 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 02 09:00:42 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Feb 02 09:00:42 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 02 09:00:42 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 02 09:00:42 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 02 09:00:42 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 02 09:00:42 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 02 09:00:42 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 02 09:00:42 localhost systemd[1]: Mounting FUSE Control File System...
Feb 02 09:00:42 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 02 09:00:42 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 02 09:00:42 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 02 09:00:42 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 02 09:00:42 localhost systemd[1]: Starting Load/Save OS Random Seed...
Feb 02 09:00:42 localhost systemd[1]: Starting Create System Users...
Feb 02 09:00:42 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 02 09:00:42 localhost systemd[1]: Mounted FUSE Control File System.
Feb 02 09:00:42 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Feb 02 09:00:42 localhost systemd-journald[680]: Received client request to flush runtime journal.
Feb 02 09:00:42 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 02 09:00:42 localhost systemd[1]: Finished Load/Save OS Random Seed.
Feb 02 09:00:42 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 02 09:00:42 localhost systemd[1]: Finished Create System Users.
Feb 02 09:00:42 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 02 09:00:42 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 02 09:00:42 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 02 09:00:42 localhost systemd[1]: Reached target Local File Systems.
Feb 02 09:00:42 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 02 09:00:42 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 02 09:00:42 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 02 09:00:42 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb 02 09:00:42 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 02 09:00:42 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 02 09:00:42 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 02 09:00:42 localhost bootctl[700]: Couldn't find EFI system partition, skipping.
Feb 02 09:00:42 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 02 09:00:42 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 02 09:00:42 localhost systemd[1]: Starting Security Auditing Service...
Feb 02 09:00:42 localhost systemd[1]: Starting RPC Bind...
Feb 02 09:00:42 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 02 09:00:42 localhost auditd[706]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb 02 09:00:42 localhost auditd[706]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb 02 09:00:42 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 02 09:00:42 localhost systemd[1]: Started RPC Bind.
Feb 02 09:00:42 localhost augenrules[711]: /sbin/augenrules: No change
Feb 02 09:00:42 localhost augenrules[726]: No rules
Feb 02 09:00:42 localhost augenrules[726]: enabled 1
Feb 02 09:00:42 localhost augenrules[726]: failure 1
Feb 02 09:00:42 localhost augenrules[726]: pid 706
Feb 02 09:00:42 localhost augenrules[726]: rate_limit 0
Feb 02 09:00:42 localhost augenrules[726]: backlog_limit 8192
Feb 02 09:00:42 localhost augenrules[726]: lost 0
Feb 02 09:00:42 localhost augenrules[726]: backlog 3
Feb 02 09:00:42 localhost augenrules[726]: backlog_wait_time 60000
Feb 02 09:00:42 localhost augenrules[726]: backlog_wait_time_actual 0
Feb 02 09:00:42 localhost augenrules[726]: enabled 1
Feb 02 09:00:42 localhost augenrules[726]: failure 1
Feb 02 09:00:42 localhost augenrules[726]: pid 706
Feb 02 09:00:42 localhost augenrules[726]: rate_limit 0
Feb 02 09:00:42 localhost augenrules[726]: backlog_limit 8192
Feb 02 09:00:42 localhost augenrules[726]: lost 0
Feb 02 09:00:42 localhost augenrules[726]: backlog 2
Feb 02 09:00:42 localhost augenrules[726]: backlog_wait_time 60000
Feb 02 09:00:42 localhost augenrules[726]: backlog_wait_time_actual 0
Feb 02 09:00:42 localhost augenrules[726]: enabled 1
Feb 02 09:00:42 localhost augenrules[726]: failure 1
Feb 02 09:00:42 localhost augenrules[726]: pid 706
Feb 02 09:00:42 localhost augenrules[726]: rate_limit 0
Feb 02 09:00:42 localhost augenrules[726]: backlog_limit 8192
Feb 02 09:00:42 localhost augenrules[726]: lost 0
Feb 02 09:00:42 localhost augenrules[726]: backlog 0
Feb 02 09:00:42 localhost augenrules[726]: backlog_wait_time 60000
Feb 02 09:00:42 localhost augenrules[726]: backlog_wait_time_actual 0
Feb 02 09:00:42 localhost systemd[1]: Started Security Auditing Service.
Feb 02 09:00:42 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 02 09:00:42 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 02 09:00:43 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 02 09:00:43 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 02 09:00:43 localhost systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Feb 02 09:00:43 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 02 09:00:43 localhost systemd[1]: Starting Update is Completed...
Feb 02 09:00:43 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 02 09:00:43 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 02 09:00:43 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 02 09:00:43 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 02 09:00:43 localhost systemd[1]: Finished Update is Completed.
Feb 02 09:00:43 localhost systemd[1]: Reached target System Initialization.
Feb 02 09:00:43 localhost systemd[1]: Started dnf makecache --timer.
Feb 02 09:00:43 localhost systemd[1]: Started Daily rotation of log files.
Feb 02 09:00:43 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 02 09:00:43 localhost systemd[1]: Reached target Timer Units.
Feb 02 09:00:43 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 02 09:00:43 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 02 09:00:43 localhost systemd[1]: Reached target Socket Units.
Feb 02 09:00:43 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 02 09:00:43 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 02 09:00:43 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 02 09:00:43 localhost systemd-udevd[741]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 09:00:43 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 02 09:00:43 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb 02 09:00:43 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb 02 09:00:43 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 02 09:00:43 localhost systemd[1]: Reached target Basic System.
Feb 02 09:00:43 localhost dbus-broker-lau[775]: Ready
Feb 02 09:00:43 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 02 09:00:43 localhost systemd[1]: Starting NTP client/server...
Feb 02 09:00:43 localhost kernel: kvm_amd: TSC scaling supported
Feb 02 09:00:43 localhost kernel: kvm_amd: Nested Virtualization enabled
Feb 02 09:00:43 localhost kernel: kvm_amd: Nested Paging enabled
Feb 02 09:00:43 localhost kernel: kvm_amd: LBR virtualization supported
Feb 02 09:00:43 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb 02 09:00:43 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 02 09:00:43 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 02 09:00:43 localhost kernel: Console: switching to colour dummy device 80x25
Feb 02 09:00:43 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 02 09:00:43 localhost kernel: [drm] features: -context_init
Feb 02 09:00:43 localhost kernel: [drm] number of scanouts: 1
Feb 02 09:00:43 localhost kernel: [drm] number of cap sets: 0
Feb 02 09:00:43 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb 02 09:00:43 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 02 09:00:43 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 02 09:00:43 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 02 09:00:43 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 02 09:00:43 localhost systemd[1]: Starting IPv4 firewall with iptables...
Feb 02 09:00:43 localhost systemd[1]: Started irqbalance daemon.
Feb 02 09:00:43 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 02 09:00:43 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 02 09:00:43 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 02 09:00:43 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 02 09:00:43 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 02 09:00:43 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 02 09:00:43 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 02 09:00:43 localhost systemd[1]: Starting User Login Management...
Feb 02 09:00:43 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 02 09:00:43 localhost chronyd[811]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 02 09:00:43 localhost chronyd[811]: Loaded 0 symmetric keys
Feb 02 09:00:43 localhost chronyd[811]: Using right/UTC timezone to obtain leap second data
Feb 02 09:00:43 localhost chronyd[811]: Loaded seccomp filter (level 2)
Feb 02 09:00:43 localhost systemd[1]: Started NTP client/server.
Feb 02 09:00:43 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 02 09:00:43 localhost systemd-logind[805]: New seat seat0.
Feb 02 09:00:43 localhost systemd-logind[805]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 02 09:00:43 localhost systemd-logind[805]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 02 09:00:43 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb 02 09:00:43 localhost systemd[1]: Started User Login Management.
Feb 02 09:00:43 localhost iptables.init[799]: iptables: Applying firewall rules: [  OK  ]
Feb 02 09:00:43 localhost systemd[1]: Finished IPv4 firewall with iptables.
Feb 02 09:00:44 localhost cloud-init[844]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 02 Feb 2026 09:00:44 +0000. Up 6.51 seconds.
Feb 02 09:00:44 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 02 09:00:44 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 02 09:00:44 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpimibsutq.mount: Deactivated successfully.
Feb 02 09:00:44 localhost systemd[1]: Starting Hostname Service...
Feb 02 09:00:44 localhost systemd[1]: Started Hostname Service.
Feb 02 09:00:44 np0005604791.novalocal systemd-hostnamed[858]: Hostname set to <np0005604791.novalocal> (static)
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Reached target Preparation for Network.
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Starting Network Manager...
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.7656] NetworkManager (version 1.54.3-2.el9) is starting... (boot:73bfa7d4-cc72-468c-831e-edc1e8589b87)
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.7662] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.7792] manager[0x55c11e6e0000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.7827] hostname: hostname: using hostnamed
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.7828] hostname: static hostname changed from (none) to "np0005604791.novalocal"
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.7830] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.7952] manager[0x55c11e6e0000]: rfkill: Wi-Fi hardware radio set enabled
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.7952] manager[0x55c11e6e0000]: rfkill: WWAN hardware radio set enabled
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8045] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8046] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8047] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8048] manager: Networking is enabled by state file
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8050] settings: Loaded settings plugin: keyfile (internal)
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8100] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8209] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8441] dhcp: init: Using DHCP client 'internal'
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8449] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8484] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8531] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8570] device (lo): Activation: starting connection 'lo' (5d713ff7-af86-4df5-9d5a-ad7ed5dcc84d)
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8583] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8589] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Started Network Manager.
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8626] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8638] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8645] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Reached target Network.
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8652] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8659] device (eth0): carrier: link connected
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8664] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8675] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8685] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8695] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8696] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8701] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8709] manager: NetworkManager state is now CONNECTING
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8713] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8722] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8733] device (lo): Activation: successful, device activated.
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8750] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:00:44 np0005604791.novalocal NetworkManager[862]: <info>  [1770022844.8758] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Reached target NFS client services.
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: Reached target Remote File Systems.
Feb 02 09:00:44 np0005604791.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 02 09:00:45 np0005604791.novalocal NetworkManager[862]: <info>  [1770022845.6162] dhcp4 (eth0): state changed new lease, address=38.102.83.189
Feb 02 09:00:45 np0005604791.novalocal NetworkManager[862]: <info>  [1770022845.6180] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 02 09:00:45 np0005604791.novalocal NetworkManager[862]: <info>  [1770022845.6206] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:00:45 np0005604791.novalocal NetworkManager[862]: <info>  [1770022845.6232] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:00:45 np0005604791.novalocal NetworkManager[862]: <info>  [1770022845.6235] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:00:45 np0005604791.novalocal NetworkManager[862]: <info>  [1770022845.6239] manager: NetworkManager state is now CONNECTED_SITE
Feb 02 09:00:45 np0005604791.novalocal NetworkManager[862]: <info>  [1770022845.6244] device (eth0): Activation: successful, device activated.
Feb 02 09:00:45 np0005604791.novalocal NetworkManager[862]: <info>  [1770022845.6250] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 02 09:00:45 np0005604791.novalocal NetworkManager[862]: <info>  [1770022845.6255] manager: startup complete
Feb 02 09:00:45 np0005604791.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 02 09:00:45 np0005604791.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 02 Feb 2026 09:00:45 +0000. Up 8.25 seconds.
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: |  eth0  | True |        38.102.83.189         | 255.255.255.0 | global | fa:16:3e:ac:ea:a6 |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: |  eth0  | True | fe80::f816:3eff:feac:eaa6/64 |       .       |  link  | fa:16:3e:ac:ea:a6 |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 02 09:00:45 np0005604791.novalocal cloud-init[925]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 02 09:00:46 np0005604791.novalocal cloud-init[925]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 02 09:00:47 np0005604791.novalocal useradd[992]: new group: name=cloud-user, GID=1001
Feb 02 09:00:47 np0005604791.novalocal useradd[992]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 02 09:00:47 np0005604791.novalocal useradd[992]: add 'cloud-user' to group 'adm'
Feb 02 09:00:47 np0005604791.novalocal useradd[992]: add 'cloud-user' to group 'systemd-journal'
Feb 02 09:00:47 np0005604791.novalocal useradd[992]: add 'cloud-user' to shadow group 'adm'
Feb 02 09:00:47 np0005604791.novalocal useradd[992]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: Generating public/private rsa key pair.
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: The key fingerprint is:
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: SHA256:SATBxxODRQB/dRXEaifL7F2mZu61vPlj0VS7a3P8BAc root@np0005604791.novalocal
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: The key's randomart image is:
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: +---[RSA 3072]----+
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |  .o+O*.. .++.   |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |   .o.+o .  .   .|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |    ..o.   .  E o|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |     o .  + .  o.|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |      . S+ +  ..+|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |          +   o=.|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |         . . +..+|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |          . =o Xo|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |           =o BoB|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: +----[SHA256]-----+
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: Generating public/private ecdsa key pair.
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: The key fingerprint is:
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: SHA256:3cWqaKHv8gpYx1XjtiGfkeI9XXQDI6cmrbyggix79RE root@np0005604791.novalocal
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: The key's randomart image is:
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: +---[ECDSA 256]---+
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |          o. +o..|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |         o.o+o...|
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |        +.*+  +  |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |     .Eo.B+B +   |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |    . ooSoB +    |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |. .o..o..o.o     |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |.o.o.o..o..      |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |... ..oo         |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |..    .=+        |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: +----[SHA256]-----+
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: Generating public/private ed25519 key pair.
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: The key fingerprint is:
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: SHA256:aYizx4EWEpqD0HGvB5mqoP2aGGblYGGFbTG9xBqOYkw root@np0005604791.novalocal
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: The key's randomart image is:
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: +--[ED25519 256]--+
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |..===            |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |oEo*.B           |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |Bo= O o          |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |o=.= B . .       |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |+o..* + S        |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |+o+. = o         |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |+o... o          |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |oo o .           |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: |. o..            |
Feb 02 09:00:48 np0005604791.novalocal cloud-init[925]: +----[SHA256]-----+
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Reached target Network is Online.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Starting System Logging Service...
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 02 09:00:48 np0005604791.novalocal sm-notify[1008]: Version 2.5.4 starting
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Starting Permit User Sessions...
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Finished Permit User Sessions.
Feb 02 09:00:48 np0005604791.novalocal sshd[1010]: Server listening on 0.0.0.0 port 22.
Feb 02 09:00:48 np0005604791.novalocal sshd[1010]: Server listening on :: port 22.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Started Command Scheduler.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Started Getty on tty1.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Reached target Login Prompts.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 02 09:00:48 np0005604791.novalocal crond[1013]: (CRON) STARTUP (1.5.7)
Feb 02 09:00:48 np0005604791.novalocal crond[1013]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 02 09:00:48 np0005604791.novalocal crond[1013]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 66% if used.)
Feb 02 09:00:48 np0005604791.novalocal crond[1013]: (CRON) INFO (running with inotify support)
Feb 02 09:00:48 np0005604791.novalocal rsyslogd[1009]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1009" x-info="https://www.rsyslog.com"] start
Feb 02 09:00:48 np0005604791.novalocal rsyslogd[1009]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Started System Logging Service.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Reached target Multi-User System.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 02 09:00:48 np0005604791.novalocal rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:00:48 np0005604791.novalocal kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Feb 02 09:00:48 np0005604791.novalocal kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-665.el9.x86_64kdump.img
Feb 02 09:00:48 np0005604791.novalocal cloud-init[1154]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 02 Feb 2026 09:00:48 +0000. Up 10.88 seconds.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Feb 02 09:00:48 np0005604791.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Feb 02 09:00:48 np0005604791.novalocal dracut[1269]: dracut-057-102.git20250818.el9
Feb 02 09:00:48 np0005604791.novalocal dracut[1271]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-665.el9.x86_64kdump.img 5.14.0-665.el9.x86_64
Feb 02 09:00:48 np0005604791.novalocal cloud-init[1305]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 02 Feb 2026 09:00:48 +0000. Up 11.23 seconds.
Feb 02 09:00:48 np0005604791.novalocal cloud-init[1341]: #############################################################
Feb 02 09:00:48 np0005604791.novalocal cloud-init[1342]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 02 09:00:48 np0005604791.novalocal cloud-init[1344]: 256 SHA256:3cWqaKHv8gpYx1XjtiGfkeI9XXQDI6cmrbyggix79RE root@np0005604791.novalocal (ECDSA)
Feb 02 09:00:48 np0005604791.novalocal cloud-init[1346]: 256 SHA256:aYizx4EWEpqD0HGvB5mqoP2aGGblYGGFbTG9xBqOYkw root@np0005604791.novalocal (ED25519)
Feb 02 09:00:49 np0005604791.novalocal cloud-init[1348]: 3072 SHA256:SATBxxODRQB/dRXEaifL7F2mZu61vPlj0VS7a3P8BAc root@np0005604791.novalocal (RSA)
Feb 02 09:00:49 np0005604791.novalocal cloud-init[1349]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 02 09:00:49 np0005604791.novalocal cloud-init[1350]: #############################################################
Feb 02 09:00:49 np0005604791.novalocal cloud-init[1305]: Cloud-init v. 24.4-8.el9 finished at Mon, 02 Feb 2026 09:00:49 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.41 seconds
Feb 02 09:00:49 np0005604791.novalocal sshd-session[1362]: Unable to negotiate with 38.102.83.114 port 54106: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 02 09:00:49 np0005604791.novalocal sshd-session[1372]: Unable to negotiate with 38.102.83.114 port 54124: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 02 09:00:49 np0005604791.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Feb 02 09:00:49 np0005604791.novalocal systemd[1]: Reached target Cloud-init target.
Feb 02 09:00:49 np0005604791.novalocal sshd-session[1377]: Unable to negotiate with 38.102.83.114 port 54128: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 02 09:00:49 np0005604791.novalocal sshd-session[1390]: Connection reset by 38.102.83.114 port 54156 [preauth]
Feb 02 09:00:49 np0005604791.novalocal sshd-session[1356]: Connection closed by 38.102.83.114 port 54098 [preauth]
Feb 02 09:00:49 np0005604791.novalocal sshd-session[1398]: Unable to negotiate with 38.102.83.114 port 54160: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Feb 02 09:00:49 np0005604791.novalocal sshd-session[1403]: Unable to negotiate with 38.102.83.114 port 54170: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 02 09:00:49 np0005604791.novalocal sshd-session[1367]: Connection closed by 38.102.83.114 port 54122 [preauth]
Feb 02 09:00:49 np0005604791.novalocal sshd-session[1384]: Connection closed by 38.102.83.114 port 54140 [preauth]
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: Module 'resume' will not be installed, because it's in the list to be omitted!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: memstrack is not available
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: memstrack is not available
Feb 02 09:00:49 np0005604791.novalocal dracut[1271]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 02 09:00:50 np0005604791.novalocal dracut[1271]: *** Including module: systemd ***
Feb 02 09:00:50 np0005604791.novalocal dracut[1271]: *** Including module: fips ***
Feb 02 09:00:50 np0005604791.novalocal dracut[1271]: *** Including module: systemd-initrd ***
Feb 02 09:00:50 np0005604791.novalocal dracut[1271]: *** Including module: i18n ***
Feb 02 09:00:50 np0005604791.novalocal dracut[1271]: *** Including module: drm ***
Feb 02 09:00:50 np0005604791.novalocal chronyd[811]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Feb 02 09:00:50 np0005604791.novalocal chronyd[811]: System clock TAI offset set to 37 seconds
Feb 02 09:00:50 np0005604791.novalocal dracut[1271]: *** Including module: prefixdevname ***
Feb 02 09:00:50 np0005604791.novalocal dracut[1271]: *** Including module: kernel-modules ***
Feb 02 09:00:51 np0005604791.novalocal kernel: block vda: the capability attribute has been deprecated.
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]: *** Including module: kernel-modules-extra ***
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]: *** Including module: qemu ***
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]: *** Including module: fstab-sys ***
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]: *** Including module: rootfs-block ***
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]: *** Including module: terminfo ***
Feb 02 09:00:51 np0005604791.novalocal dracut[1271]: *** Including module: udev-rules ***
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]: Skipping udev rule: 91-permissions.rules
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]: *** Including module: virtiofs ***
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]: *** Including module: dracut-systemd ***
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]: *** Including module: usrmount ***
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]: *** Including module: base ***
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]: *** Including module: fs-lib ***
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]: *** Including module: kdumpbase ***
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:   microcode_ctl module: mangling fw_dir
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel" is ignored
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 02 09:00:52 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]: *** Including module: openssl ***
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]: *** Including module: shutdown ***
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]: *** Including module: squash ***
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]: *** Including modules done ***
Feb 02 09:00:53 np0005604791.novalocal dracut[1271]: *** Installing kernel module dependencies ***
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: Cannot change IRQ 25 affinity: Operation not permitted
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: IRQ 25 affinity is now unmanaged
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: Cannot change IRQ 31 affinity: Operation not permitted
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: IRQ 31 affinity is now unmanaged
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: Cannot change IRQ 28 affinity: Operation not permitted
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: IRQ 28 affinity is now unmanaged
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: Cannot change IRQ 32 affinity: Operation not permitted
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: IRQ 32 affinity is now unmanaged
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: Cannot change IRQ 30 affinity: Operation not permitted
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: IRQ 30 affinity is now unmanaged
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: Cannot change IRQ 29 affinity: Operation not permitted
Feb 02 09:00:53 np0005604791.novalocal irqbalance[804]: IRQ 29 affinity is now unmanaged
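
The irqbalance warnings above mean the daemon tried to move those interrupts, the kernel rejected the new affinity mask with EPERM ("Operation not permitted"), and irqbalance stopped managing them; on a small virtio-based KVM guest this is typically benign. A read-only sketch, assuming the standard Linux /proc layout, that prints the current affinity of the IRQs named above without changing anything:

#!/usr/bin/env python3
# Read-only look at the affinity of the IRQs irqbalance gave up on above.
# Assumes the standard Linux /proc layout; some files may need root to read.
from pathlib import Path

IRQS = [25, 28, 29, 30, 31, 32]   # taken from the irqbalance messages above

for irq in IRQS:
    path = Path(f"/proc/irq/{irq}/smp_affinity_list")
    try:
        cpus = path.read_text().strip()
    except OSError as exc:
        cpus = f"unreadable ({exc.strerror})"
    print(f"IRQ {irq}: allowed CPUs {cpus}")
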
Feb 02 09:00:54 np0005604791.novalocal dracut[1271]: *** Installing kernel module dependencies done ***
Feb 02 09:00:54 np0005604791.novalocal dracut[1271]: *** Resolving executable dependencies ***
Feb 02 09:00:55 np0005604791.novalocal dracut[1271]: *** Resolving executable dependencies done ***
Feb 02 09:00:55 np0005604791.novalocal dracut[1271]: *** Generating early-microcode cpio image ***
Feb 02 09:00:55 np0005604791.novalocal dracut[1271]: *** Store current command line parameters ***
Feb 02 09:00:55 np0005604791.novalocal dracut[1271]: Stored kernel commandline:
Feb 02 09:00:55 np0005604791.novalocal dracut[1271]: No dracut internal kernel commandline stored in the initramfs
Feb 02 09:00:55 np0005604791.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 02 09:00:55 np0005604791.novalocal dracut[1271]: *** Install squash loader ***
Feb 02 09:00:56 np0005604791.novalocal dracut[1271]: *** Squashing the files inside the initramfs ***
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: *** Squashing the files inside the initramfs done ***
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: *** Creating image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' ***
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: *** Hardlinking files ***
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: Mode:           real
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: Files:          50
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: Linked:         0 files
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: Compared:       0 xattrs
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: Compared:       0 files
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: Saved:          0 B
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: Duration:       0.000937 seconds
Feb 02 09:00:57 np0005604791.novalocal dracut[1271]: *** Hardlinking files done ***
Feb 02 09:00:58 np0005604791.novalocal dracut[1271]: *** Creating initramfs image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' done ***
Feb 02 09:00:58 np0005604791.novalocal kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Feb 02 09:00:58 np0005604791.novalocal kdumpctl[1018]: kdump: Starting kdump: [OK]
Feb 02 09:00:58 np0005604791.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 02 09:00:58 np0005604791.novalocal systemd[1]: Startup finished in 1.235s (kernel) + 2.610s (initrd) + 17.100s (userspace) = 20.946s.
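
The "Startup finished" summary above splits the boot into kernel, initrd and userspace phases. A small sketch that parses such a line; note the printed parts sum to 20.945s while the total reads 20.946s, most likely because systemd keeps microsecond precision internally and rounds each figure separately for display:

#!/usr/bin/env python3
# Parse the systemd "Startup finished" summary into its per-phase timings.
import re

LINE = ("Startup finished in 1.235s (kernel) + 2.610s (initrd) "
        "+ 17.100s (userspace) = 20.946s.")

phases = {name: float(sec) for sec, name in re.findall(r"([\d.]+)s \((\w+)\)", LINE)}
total = float(re.search(r"= ([\d.]+)s\.", LINE).group(1))

print(phases)                         # {'kernel': 1.235, 'initrd': 2.61, 'userspace': 17.1}
print(round(sum(phases.values()), 3), total)
# 20.945 vs. 20.946 -- the parts are rounded individually for display,
# so they need not sum exactly to the printed total.
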
Feb 02 09:01:01 np0005604791.novalocal CROND[4307]: (root) CMD (run-parts /etc/cron.hourly)
Feb 02 09:01:01 np0005604791.novalocal run-parts[4310]: (/etc/cron.hourly) starting 0anacron
Feb 02 09:01:01 np0005604791.novalocal anacron[4318]: Anacron started on 2026-02-02
Feb 02 09:01:01 np0005604791.novalocal anacron[4318]: Will run job `cron.daily' in 21 min.
Feb 02 09:01:01 np0005604791.novalocal anacron[4318]: Will run job `cron.weekly' in 41 min.
Feb 02 09:01:01 np0005604791.novalocal anacron[4318]: Will run job `cron.monthly' in 61 min.
Feb 02 09:01:01 np0005604791.novalocal anacron[4318]: Jobs will be executed sequentially
Feb 02 09:01:01 np0005604791.novalocal run-parts[4320]: (/etc/cron.hourly) finished 0anacron
Feb 02 09:01:01 np0005604791.novalocal CROND[4306]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 02 09:01:14 np0005604791.novalocal sshd-session[4321]: Accepted publickey for zuul from 38.102.83.114 port 34446 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 02 09:01:14 np0005604791.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 02 09:01:14 np0005604791.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 02 09:01:14 np0005604791.novalocal systemd-logind[805]: New session 1 of user zuul.
Feb 02 09:01:14 np0005604791.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 02 09:01:14 np0005604791.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:01:14 np0005604791.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Queued start job for default target Main User Target.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Created slice User Application Slice.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Started Daily Cleanup of User's Temporary Directories.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Reached target Paths.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Reached target Timers.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Starting D-Bus User Message Bus Socket...
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Starting Create User's Volatile Files and Directories...
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Finished Create User's Volatile Files and Directories.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Listening on D-Bus User Message Bus Socket.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Reached target Sockets.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Reached target Basic System.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Reached target Main User Target.
Feb 02 09:01:14 np0005604791.novalocal systemd[4325]: Startup finished in 179ms.
Feb 02 09:01:14 np0005604791.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 02 09:01:14 np0005604791.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 02 09:01:14 np0005604791.novalocal sshd-session[4321]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:01:15 np0005604791.novalocal python3[4409]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:01:18 np0005604791.novalocal python3[4437]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:01:26 np0005604791.novalocal python3[4495]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:01:27 np0005604791.novalocal python3[4535]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 02 09:01:29 np0005604791.novalocal python3[4561]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDres8I0e2lx2XlkDi/o8mbn7A8kJLvscauEMeSccA/Q28EgVAaHKAaMzB7MTuExuZhV2hKdHCjChvbo+ZEItJb42XILxS2oD7nNZFvVgzBQniv52jPQzNymZKv6xSxlAe2fhEntL1UKK7rrlHSbTvpCdGBhDUQsTkZLTXEabEEU2AUKrMcF1w86Dag94m2LcmlUNBhMgEGG2gCAwR3LArhvliT36AiA+uCD9ZLWOYPkktaBOoVTE2SXaHLM/QcLtQ9fjx6HlaVH0Yhtj7rqVbzUqi90TmhLPQuW8eD8LtDzn9vdNraZXTqHagLV5n5OxOivwbk4MGal3/4FVMfbvwmkxfPWWHnq9CpCjdr2/8NZkLs7rZjZtRj+oszTemHh2fSvs0qv1+QN2N9Fo3lRt/o3COnsw0ktNu6Xln+nqj4Bt/yqB5VmDCXaqp2DHhGlCM3XpR2F7xlpNITVJVPl9bGLc9YHytFHIM9fCjt1aMlyP028PhHIHlcB7LcSSd5QM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:29 np0005604791.novalocal python3[4585]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:30 np0005604791.novalocal python3[4684]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:01:30 np0005604791.novalocal python3[4755]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770022889.827167-252-9767779315071/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=b133c3a79151467e8c6849ab0367df01_id_rsa follow=False checksum=97092328ac9cd34b53b1d81cf7562eb94a095d6b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:31 np0005604791.novalocal python3[4878]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:01:31 np0005604791.novalocal python3[4949]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770022890.7703822-307-164783705280026/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=b133c3a79151467e8c6849ab0367df01_id_rsa.pub follow=False checksum=79261b1251eaaf0ed818421d3062a6de11fbecf0 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:32 np0005604791.novalocal python3[4997]: ansible-ping Invoked with data=pong
Feb 02 09:01:33 np0005604791.novalocal python3[5021]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:01:36 np0005604791.novalocal python3[5079]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 02 09:01:37 np0005604791.novalocal python3[5111]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:37 np0005604791.novalocal python3[5135]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:37 np0005604791.novalocal python3[5159]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:38 np0005604791.novalocal python3[5183]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:38 np0005604791.novalocal python3[5207]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:38 np0005604791.novalocal python3[5231]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
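
The ansible file/copy tasks above log mode= as a plain decimal integer; those values are simply the familiar octal permission bits rendered in base 10. A one-liner to map them back:

#!/usr/bin/env python3
# Ansible logs file modes as decimal integers; convert them back to octal.
for decimal in (448, 384, 420, 493):
    print(decimal, "->", oct(decimal))
# 448 -> 0o700, 384 -> 0o600, 420 -> 0o644, 493 -> 0o755; the mode=511 (0o777)
# and mode=288 (0o440) values further down in this log convert the same way.
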
Feb 02 09:01:40 np0005604791.novalocal sudo[5255]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlvotdhimmrrdvfdwttsjbpklrwivkaj ; /usr/bin/python3'
Feb 02 09:01:40 np0005604791.novalocal sudo[5255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:40 np0005604791.novalocal python3[5257]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:40 np0005604791.novalocal sudo[5255]: pam_unix(sudo:session): session closed for user root
Feb 02 09:01:40 np0005604791.novalocal sudo[5333]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jywjbkwkzyagzmqgpbdlcjusvkctcncg ; /usr/bin/python3'
Feb 02 09:01:40 np0005604791.novalocal sudo[5333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:40 np0005604791.novalocal python3[5335]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:01:40 np0005604791.novalocal sudo[5333]: pam_unix(sudo:session): session closed for user root
Feb 02 09:01:41 np0005604791.novalocal sudo[5406]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiurqumjrkdvlfizefdbasacanodbzle ; /usr/bin/python3'
Feb 02 09:01:41 np0005604791.novalocal sudo[5406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:41 np0005604791.novalocal python3[5408]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1770022900.5012546-32-160872255291395/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:41 np0005604791.novalocal sudo[5406]: pam_unix(sudo:session): session closed for user root
Feb 02 09:01:42 np0005604791.novalocal python3[5456]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:42 np0005604791.novalocal python3[5480]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:42 np0005604791.novalocal python3[5504]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:42 np0005604791.novalocal python3[5528]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:43 np0005604791.novalocal python3[5552]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:43 np0005604791.novalocal python3[5576]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:43 np0005604791.novalocal python3[5600]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:43 np0005604791.novalocal python3[5624]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:44 np0005604791.novalocal python3[5648]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:44 np0005604791.novalocal python3[5672]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:44 np0005604791.novalocal python3[5696]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:44 np0005604791.novalocal python3[5720]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:45 np0005604791.novalocal python3[5744]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:45 np0005604791.novalocal python3[5768]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:45 np0005604791.novalocal python3[5792]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:46 np0005604791.novalocal python3[5816]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:46 np0005604791.novalocal python3[5840]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:46 np0005604791.novalocal python3[5864]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:46 np0005604791.novalocal python3[5888]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:47 np0005604791.novalocal python3[5912]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:47 np0005604791.novalocal python3[5936]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:47 np0005604791.novalocal python3[5960]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:47 np0005604791.novalocal python3[5984]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:48 np0005604791.novalocal python3[6008]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:48 np0005604791.novalocal python3[6032]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:01:48 np0005604791.novalocal python3[6056]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
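
The block of ansible-authorized_key invocations above installs a list of developers' public keys into the zuul user's authorized_keys, one task per key, each with state=present (append the key if it is absent, otherwise leave the file untouched). A rough sketch of that idempotent behaviour, making no assumptions about the module's real implementation; the key strings and target path below are illustrative stand-ins, not the real data:

#!/usr/bin/env python3
# Idempotent "ensure key present" sketch, approximating the effect of the
# repeated ansible-authorized_key invocations above for the zuul user.
# KEYS and AUTHORIZED_KEYS are illustrative stand-ins, not the real data.
from pathlib import Path

AUTHORIZED_KEYS = Path.home() / ".ssh" / "authorized_keys"
KEYS = [
    "ssh-ed25519 AAAA...example1 someone@example.com",
    "ssh-rsa AAAA...example2 someone-else@example.com",
]

def ensure_present(path: Path, keys: list[str]) -> int:
    path.parent.mkdir(mode=0o700, parents=True, exist_ok=True)
    existing = set(path.read_text().splitlines()) if path.exists() else set()
    missing = [key for key in keys if key not in existing]
    if missing:
        with path.open("a") as fh:
            for key in missing:
                fh.write(key + "\n")
        path.chmod(0o600)
    return len(missing)

if __name__ == "__main__":
    added = ensure_present(AUTHORIZED_KEYS, KEYS)
    print(f"added {added} new key(s) to {AUTHORIZED_KEYS}")
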
Feb 02 09:01:51 np0005604791.novalocal sudo[6080]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afvmwljrspkaidtkwvoxtvykyjupywla ; /usr/bin/python3'
Feb 02 09:01:51 np0005604791.novalocal sudo[6080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:51 np0005604791.novalocal python3[6082]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 02 09:01:51 np0005604791.novalocal systemd[1]: Starting Time & Date Service...
Feb 02 09:01:51 np0005604791.novalocal systemd[1]: Started Time & Date Service.
Feb 02 09:01:51 np0005604791.novalocal systemd-timedated[6084]: Changed time zone to 'UTC' (UTC).
Feb 02 09:01:51 np0005604791.novalocal sudo[6080]: pam_unix(sudo:session): session closed for user root
Feb 02 09:01:51 np0005604791.novalocal sudo[6111]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvaarvigqovipiyzztvnokpglarzzbxl ; /usr/bin/python3'
Feb 02 09:01:51 np0005604791.novalocal sudo[6111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:51 np0005604791.novalocal python3[6113]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:51 np0005604791.novalocal sudo[6111]: pam_unix(sudo:session): session closed for user root
Feb 02 09:01:52 np0005604791.novalocal python3[6189]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:01:52 np0005604791.novalocal python3[6260]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1770022912.0679753-252-275154762755933/source _original_basename=tmp7vuu01de follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:53 np0005604791.novalocal python3[6360]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:01:53 np0005604791.novalocal python3[6431]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1770022912.8973043-303-57341955622931/source _original_basename=tmpc0ih3ok9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:54 np0005604791.novalocal sudo[6531]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhemhlydjpkueoukwfqzzvjlnwjlfmsi ; /usr/bin/python3'
Feb 02 09:01:54 np0005604791.novalocal sudo[6531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:54 np0005604791.novalocal python3[6533]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:01:54 np0005604791.novalocal sudo[6531]: pam_unix(sudo:session): session closed for user root
Feb 02 09:01:54 np0005604791.novalocal sudo[6604]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsoukbwpbeqtujwgahmgbtwfpsbjwsrq ; /usr/bin/python3'
Feb 02 09:01:54 np0005604791.novalocal sudo[6604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:54 np0005604791.novalocal python3[6606]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1770022914.0356104-382-146128306350918/source _original_basename=tmpzimi10cz follow=False checksum=b9ea63fb38f50d3257ec076159ca59d9b4b7fe2c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:54 np0005604791.novalocal sudo[6604]: pam_unix(sudo:session): session closed for user root
Feb 02 09:01:55 np0005604791.novalocal python3[6654]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:01:55 np0005604791.novalocal python3[6680]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:01:55 np0005604791.novalocal sudo[6758]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upqjxmtnaqdllaphzbhfnsqyvczpgmbb ; /usr/bin/python3'
Feb 02 09:01:55 np0005604791.novalocal sudo[6758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:55 np0005604791.novalocal python3[6760]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:01:55 np0005604791.novalocal sudo[6758]: pam_unix(sudo:session): session closed for user root
Feb 02 09:01:56 np0005604791.novalocal sudo[6831]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuqzfpsrsktfqgyzpkpqcwuvtaeryfvb ; /usr/bin/python3'
Feb 02 09:01:56 np0005604791.novalocal sudo[6831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:56 np0005604791.novalocal python3[6833]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1770022915.592871-452-226389565636795/source _original_basename=tmppg98_f17 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:01:56 np0005604791.novalocal sudo[6831]: pam_unix(sudo:session): session closed for user root
Feb 02 09:01:56 np0005604791.novalocal sudo[6882]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrmtbxddbnhgwrffzauyqynroiyqvxyr ; /usr/bin/python3'
Feb 02 09:01:56 np0005604791.novalocal sudo[6882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:01:56 np0005604791.novalocal python3[6884]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-385c-c5be-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:01:56 np0005604791.novalocal sudo[6882]: pam_unix(sudo:session): session closed for user root
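
After installing /etc/sudoers.d/zuul-sudo-grep (mode 288, i.e. 0o440), the job runs /usr/sbin/visudo -c to confirm the sudoers configuration still parses. A sketch of the same check scoped to just the new drop-in, assuming visudo's -c/-f options as shipped with sudo on EL9; the drop-in path is taken from the log lines above:

#!/usr/bin/env python3
# Syntax-check a single sudoers drop-in, mirroring the "visudo -c" step above.
import subprocess
import sys

CANDIDATE = "/etc/sudoers.d/zuul-sudo-grep"

result = subprocess.run(
    ["/usr/sbin/visudo", "-c", "-f", CANDIDATE],
    capture_output=True, text=True,
)
print(result.stdout.strip() or result.stderr.strip())
sys.exit(result.returncode)
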
Feb 02 09:01:57 np0005604791.novalocal python3[6912]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-385c-c5be-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 02 09:01:58 np0005604791.novalocal python3[6940]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:02:03 np0005604791.novalocal irqbalance[804]: Cannot change IRQ 27 affinity: Operation not permitted
Feb 02 09:02:03 np0005604791.novalocal irqbalance[804]: IRQ 27 affinity is now unmanaged
Feb 02 09:02:17 np0005604791.novalocal sudo[6964]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzrvnrospshppptysxyzpddpbcgakvgc ; /usr/bin/python3'
Feb 02 09:02:17 np0005604791.novalocal sudo[6964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:02:17 np0005604791.novalocal python3[6966]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:02:17 np0005604791.novalocal sudo[6964]: pam_unix(sudo:session): session closed for user root
Feb 02 09:02:21 np0005604791.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 02 09:03:17 np0005604791.novalocal sshd-session[4336]: Received disconnect from 38.102.83.114 port 34446:11: disconnected by user
Feb 02 09:03:17 np0005604791.novalocal sshd-session[4336]: Disconnected from user zuul 38.102.83.114 port 34446
Feb 02 09:03:17 np0005604791.novalocal sshd-session[4321]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:03:17 np0005604791.novalocal systemd-logind[805]: Session 1 logged out. Waiting for processes to exit.
Feb 02 09:03:29 np0005604791.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 02 09:03:29 np0005604791.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb 02 09:03:29 np0005604791.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb 02 09:03:29 np0005604791.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb 02 09:03:29 np0005604791.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb 02 09:03:29 np0005604791.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb 02 09:03:29 np0005604791.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb 02 09:03:29 np0005604791.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb 02 09:03:29 np0005604791.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb 02 09:03:29 np0005604791.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0160] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 02 09:03:30 np0005604791.novalocal systemd-udevd[6970]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0299] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0330] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0336] device (eth1): carrier: link connected
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0339] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0348] policy: auto-activating connection 'Wired connection 1' (4b750100-60f6-352c-8e2d-c3f0f09dae3a)
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0355] device (eth1): Activation: starting connection 'Wired connection 1' (4b750100-60f6-352c-8e2d-c3f0f09dae3a)
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0356] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0362] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0368] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:03:30 np0005604791.novalocal NetworkManager[862]: <info>  [1770023010.0375] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 02 09:03:30 np0005604791.novalocal systemd[4325]: Starting Mark boot as successful...
Feb 02 09:03:30 np0005604791.novalocal systemd[4325]: Finished Mark boot as successful.
Feb 02 09:03:31 np0005604791.novalocal sshd-session[6974]: Accepted publickey for zuul from 38.102.83.114 port 37084 ssh2: RSA SHA256:oiZKnX5kwvqrsUV0ZZjSac+GUqsMvprIFYZPo6yyNjU
Feb 02 09:03:31 np0005604791.novalocal systemd-logind[805]: New session 3 of user zuul.
Feb 02 09:03:31 np0005604791.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 02 09:03:31 np0005604791.novalocal sshd-session[6974]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:03:31 np0005604791.novalocal python3[7001]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-26e4-4fa2-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:03:41 np0005604791.novalocal sudo[7079]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcgkpxgdhhmzpjxorlroplcqlgkhthxl ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 02 09:03:41 np0005604791.novalocal sudo[7079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:03:41 np0005604791.novalocal python3[7081]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:03:41 np0005604791.novalocal sudo[7079]: pam_unix(sudo:session): session closed for user root
Feb 02 09:03:42 np0005604791.novalocal sudo[7152]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itxpshtlcxxjbcqffqrtqrvlsjkoxhgs ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 02 09:03:42 np0005604791.novalocal sudo[7152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:03:42 np0005604791.novalocal python3[7154]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770023021.5348797-155-64804940174475/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=41a2bd022780f84a2a9b026b65aafb4433cf3332 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:03:42 np0005604791.novalocal sudo[7152]: pam_unix(sudo:session): session closed for user root
Feb 02 09:03:42 np0005604791.novalocal sudo[7202]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltqllceirnkvjmbolyrinczaswelrugl ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 02 09:03:42 np0005604791.novalocal sudo[7202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:03:42 np0005604791.novalocal python3[7204]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Stopping Network Manager...
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[862]: <info>  [1770023022.8048] caught SIGTERM, shutting down normally.
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[862]: <info>  [1770023022.8062] dhcp4 (eth0): canceled DHCP transaction
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[862]: <info>  [1770023022.8063] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[862]: <info>  [1770023022.8063] dhcp4 (eth0): state changed no lease
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[862]: <info>  [1770023022.8066] manager: NetworkManager state is now CONNECTING
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[862]: <info>  [1770023022.8192] dhcp4 (eth1): canceled DHCP transaction
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[862]: <info>  [1770023022.8192] dhcp4 (eth1): state changed no lease
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[862]: <info>  [1770023022.8263] exiting (success)
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Stopped Network Manager.
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: NetworkManager.service: Consumed 1.524s CPU time, 9.9M memory peak.
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Starting Network Manager...
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.8881] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:73bfa7d4-cc72-468c-831e-edc1e8589b87)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.8882] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.8951] manager[0x55e12412b000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Starting Hostname Service...
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Started Hostname Service.
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9628] hostname: hostname: using hostnamed
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9629] hostname: static hostname changed from (none) to "np0005604791.novalocal"
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9637] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9642] manager[0x55e12412b000]: rfkill: Wi-Fi hardware radio set enabled
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9643] manager[0x55e12412b000]: rfkill: WWAN hardware radio set enabled
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9683] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9683] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9684] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9685] manager: Networking is enabled by state file
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9688] settings: Loaded settings plugin: keyfile (internal)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9693] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9730] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9743] dhcp: init: Using DHCP client 'internal'
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9747] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9755] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9762] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9775] device (lo): Activation: starting connection 'lo' (5d713ff7-af86-4df5-9d5a-ad7ed5dcc84d)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9783] device (eth0): carrier: link connected
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9789] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9798] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9799] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9808] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9818] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9825] device (eth1): carrier: link connected
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9831] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9838] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (4b750100-60f6-352c-8e2d-c3f0f09dae3a) (indicated)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9839] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9846] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9856] device (eth1): Activation: starting connection 'Wired connection 1' (4b750100-60f6-352c-8e2d-c3f0f09dae3a)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9864] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Started Network Manager.
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9869] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9874] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9876] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9879] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9883] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9886] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9889] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9892] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9902] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9906] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9918] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9922] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9945] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9951] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9959] device (lo): Activation: successful, device activated.
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9970] dhcp4 (eth0): state changed new lease, address=38.102.83.189
Feb 02 09:03:42 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023022.9980] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 02 09:03:42 np0005604791.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 02 09:03:43 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023023.0083] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 02 09:03:43 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023023.0105] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 02 09:03:43 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023023.0108] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 02 09:03:43 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023023.0112] manager: NetworkManager state is now CONNECTED_SITE
Feb 02 09:03:43 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023023.0117] device (eth0): Activation: successful, device activated.
Feb 02 09:03:43 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023023.0124] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 02 09:03:43 np0005604791.novalocal sudo[7202]: pam_unix(sudo:session): session closed for user root
Feb 02 09:03:43 np0005604791.novalocal python3[7288]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-26e4-4fa2-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:03:53 np0005604791.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 02 09:04:12 np0005604791.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6370] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 02 09:04:28 np0005604791.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 02 09:04:28 np0005604791.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6635] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6642] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6659] device (eth1): Activation: successful, device activated.
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6673] manager: startup complete
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6681] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <warn>  [1770023068.6698] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6710] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb 02 09:04:28 np0005604791.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6812] dhcp4 (eth1): canceled DHCP transaction
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6812] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6813] dhcp4 (eth1): state changed no lease
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6830] policy: auto-activating connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50)
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6837] device (eth1): Activation: starting connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50)
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6838] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6842] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6850] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.6861] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.7217] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.7221] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:04:28 np0005604791.novalocal NetworkManager[7213]: <info>  [1770023068.7231] device (eth1): Activation: successful, device activated.
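The eth1 sequence above is NetworkManager's standard fallback: the assumed 'Wired connection 1' profile never obtained a DHCP lease inside the 45-second transaction window, the device was failed with reason 'ip-config-unavailable', and the autoconnect policy then activated the lower-priority 'ci-private-network' profile instead. A minimal sketch of how one might watch for a device stuck waiting on IP configuration like this, assuming nmcli is available (the 45 s budget simply mirrors the dhcp4 timeout logged above):

    #!/usr/bin/env python3
    # Poll NetworkManager device states via nmcli terse output and report
    # any device still not connected once the DHCP-sized budget expires.
    import subprocess
    import time

    def device_states() -> dict:
        out = subprocess.run(
            ["nmcli", "-t", "-f", "DEVICE,STATE", "device"],
            capture_output=True, text=True, check=True,
        ).stdout
        return dict(line.split(":", 1) for line in out.splitlines() if line)

    deadline = time.monotonic() + 45  # same budget as the dhcp4 transactions above
    stuck = {}
    while time.monotonic() < deadline:
        stuck = {d: s for d, s in device_states().items()
                 if not s.startswith("connected")}
        if not stuck:
            break
        time.sleep(1)
    else:
        print("still waiting on:", stuck)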
Feb 02 09:04:38 np0005604791.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 02 09:04:43 np0005604791.novalocal sshd-session[6977]: Received disconnect from 38.102.83.114 port 37084:11: disconnected by user
Feb 02 09:04:43 np0005604791.novalocal sshd-session[6977]: Disconnected from user zuul 38.102.83.114 port 37084
Feb 02 09:04:43 np0005604791.novalocal sshd-session[6974]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:04:43 np0005604791.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 02 09:04:43 np0005604791.novalocal systemd[1]: session-3.scope: Consumed 1.513s CPU time.
Feb 02 09:04:43 np0005604791.novalocal systemd-logind[805]: Session 3 logged out. Waiting for processes to exit.
Feb 02 09:04:43 np0005604791.novalocal systemd-logind[805]: Removed session 3.
Feb 02 09:05:22 np0005604791.novalocal sshd-session[7316]: Accepted publickey for zuul from 38.102.83.114 port 59718 ssh2: RSA SHA256:oiZKnX5kwvqrsUV0ZZjSac+GUqsMvprIFYZPo6yyNjU
Feb 02 09:05:22 np0005604791.novalocal systemd-logind[805]: New session 4 of user zuul.
Feb 02 09:05:22 np0005604791.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 02 09:05:22 np0005604791.novalocal sshd-session[7316]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:05:22 np0005604791.novalocal sudo[7395]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxkbpqxuxmbyhuzdwjmllbowaaokuacv ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 02 09:05:22 np0005604791.novalocal sudo[7395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:05:22 np0005604791.novalocal python3[7397]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:05:22 np0005604791.novalocal sudo[7395]: pam_unix(sudo:session): session closed for user root
Feb 02 09:05:22 np0005604791.novalocal sudo[7468]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcrbtwybkrywromjekrgftucderlurpl ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 02 09:05:22 np0005604791.novalocal sudo[7468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:05:22 np0005604791.novalocal python3[7470]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770023122.3360052-373-268191696952508/source _original_basename=tmp3foveb5q follow=False checksum=f14c371f1ecf34b9a35f6f9273fe37702180eaed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:05:22 np0005604791.novalocal sudo[7468]: pam_unix(sudo:session): session closed for user root
Feb 02 09:05:23 np0005604791.novalocal irqbalance[804]: Cannot change IRQ 26 affinity: Operation not permitted
Feb 02 09:05:23 np0005604791.novalocal irqbalance[804]: IRQ 26 affinity is now unmanaged
Feb 02 09:05:25 np0005604791.novalocal sshd-session[7319]: Connection closed by 38.102.83.114 port 59718
Feb 02 09:05:25 np0005604791.novalocal sshd-session[7316]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:05:25 np0005604791.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 02 09:05:25 np0005604791.novalocal systemd-logind[805]: Session 4 logged out. Waiting for processes to exit.
Feb 02 09:05:25 np0005604791.novalocal systemd-logind[805]: Removed session 4.
Feb 02 09:06:52 np0005604791.novalocal systemd[4325]: Created slice User Background Tasks Slice.
Feb 02 09:06:52 np0005604791.novalocal systemd[4325]: Starting Cleanup of User's Temporary Files and Directories...
Feb 02 09:06:52 np0005604791.novalocal systemd[4325]: Finished Cleanup of User's Temporary Files and Directories.
Feb 02 09:13:15 np0005604791.novalocal sshd-session[7501]: Accepted publickey for zuul from 38.102.83.114 port 56174 ssh2: RSA SHA256:oiZKnX5kwvqrsUV0ZZjSac+GUqsMvprIFYZPo6yyNjU
Feb 02 09:13:15 np0005604791.novalocal systemd-logind[805]: New session 5 of user zuul.
Feb 02 09:13:15 np0005604791.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 02 09:13:15 np0005604791.novalocal sshd-session[7501]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:13:15 np0005604791.novalocal sudo[7528]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjuyvhxpbuoiqpfzlnkxehjhuzedsmhk ; /usr/bin/python3'
Feb 02 09:13:15 np0005604791.novalocal sudo[7528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:16 np0005604791.novalocal python3[7530]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ec2-ffbe-f989-50d9-00000000217d-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:13:16 np0005604791.novalocal sudo[7528]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:16 np0005604791.novalocal sudo[7556]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crnbafvfxphtcdxptdfyavmkqfzgaovx ; /usr/bin/python3'
Feb 02 09:13:16 np0005604791.novalocal sudo[7556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:16 np0005604791.novalocal python3[7558]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:13:16 np0005604791.novalocal sudo[7556]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:16 np0005604791.novalocal sudo[7582]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roadyfbodouxvjaltdkkjqizdgdvhlvu ; /usr/bin/python3'
Feb 02 09:13:16 np0005604791.novalocal sudo[7582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:16 np0005604791.novalocal python3[7585]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:13:16 np0005604791.novalocal sudo[7582]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:16 np0005604791.novalocal sudo[7609]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvtfovnaumfvwdagocahsgblddkobloi ; /usr/bin/python3'
Feb 02 09:13:16 np0005604791.novalocal sudo[7609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:16 np0005604791.novalocal python3[7611]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:13:16 np0005604791.novalocal sudo[7609]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:17 np0005604791.novalocal sudo[7635]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kebzaumhqfgdonjhlenvqyonccrkdsgz ; /usr/bin/python3'
Feb 02 09:13:17 np0005604791.novalocal sudo[7635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:17 np0005604791.novalocal python3[7637]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:13:17 np0005604791.novalocal sudo[7635]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:17 np0005604791.novalocal sudo[7661]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgtcvyjtzoirmnjnoteikgqifwbgwfxo ; /usr/bin/python3'
Feb 02 09:13:17 np0005604791.novalocal sudo[7661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:17 np0005604791.novalocal python3[7663]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:13:17 np0005604791.novalocal sudo[7661]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:18 np0005604791.novalocal sudo[7739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnvqybzwwsipjnjzscpiyvfmqdvdyabu ; /usr/bin/python3'
Feb 02 09:13:18 np0005604791.novalocal sudo[7739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:18 np0005604791.novalocal python3[7741]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:13:18 np0005604791.novalocal sudo[7739]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:18 np0005604791.novalocal sudo[7812]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kptfjcwigmyaviwarlaczskhhruqxfun ; /usr/bin/python3'
Feb 02 09:13:18 np0005604791.novalocal sudo[7812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:18 np0005604791.novalocal python3[7814]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770023598.109643-547-108383498769253/source _original_basename=tmplgda88eh follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:13:18 np0005604791.novalocal sudo[7812]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:19 np0005604791.novalocal sudo[7862]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biclbijspuefxrzcnhaythionfokdpbu ; /usr/bin/python3'
Feb 02 09:13:19 np0005604791.novalocal sudo[7862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:19 np0005604791.novalocal python3[7864]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 02 09:13:19 np0005604791.novalocal systemd[1]: Reloading.
Feb 02 09:13:19 np0005604791.novalocal systemd-rc-local-generator[7883]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:13:19 np0005604791.novalocal sudo[7862]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:21 np0005604791.novalocal sudo[7918]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vddhuxnvmlnhpvsksaejwabyzzbnhqez ; /usr/bin/python3'
Feb 02 09:13:21 np0005604791.novalocal sudo[7918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:21 np0005604791.novalocal python3[7920]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 02 09:13:21 np0005604791.novalocal sudo[7918]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:21 np0005604791.novalocal sudo[7944]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqjbzmyrdejpxmvzmfpuyiloutloahyy ; /usr/bin/python3'
Feb 02 09:13:21 np0005604791.novalocal sudo[7944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:21 np0005604791.novalocal python3[7946]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:13:21 np0005604791.novalocal sudo[7944]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:21 np0005604791.novalocal sudo[7972]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdgnliywezdnjilpzjxwwhglwjqjxvac ; /usr/bin/python3'
Feb 02 09:13:21 np0005604791.novalocal sudo[7972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:22 np0005604791.novalocal python3[7974]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:13:22 np0005604791.novalocal sudo[7972]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:22 np0005604791.novalocal sudo[8000]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjhbttpufvevrjqilyjydiuvqcnqmqkh ; /usr/bin/python3'
Feb 02 09:13:22 np0005604791.novalocal sudo[8000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:22 np0005604791.novalocal python3[8002]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:13:22 np0005604791.novalocal sudo[8000]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:22 np0005604791.novalocal sudo[8028]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbgmltltcumzhgpdwzjlrkahphohewmz ; /usr/bin/python3'
Feb 02 09:13:22 np0005604791.novalocal sudo[8028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:22 np0005604791.novalocal python3[8030]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:13:22 np0005604791.novalocal sudo[8028]: pam_unix(sudo:session): session closed for user root
Feb 02 09:13:23 np0005604791.novalocal python3[8057]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163ec2-ffbe-f989-50d9-000000002184-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
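The four echo commands above are plain cgroup-v2 interface-file writes: "252:0" is the MAJ:MIN of /dev/vda captured by the earlier lsblk task, and the riops/wiops/rbps/wbps keys cap each top-level slice at 18000 IOPS and 262144000 B/s (250 MiB/s) in each direction. The preceding override.conf drop-in, daemon-reload, and wait_for on system.slice/io.max are presumably what cause the io controller's files to appear (the drop-in's content is not logged). A self-contained sketch of the same throttling, assuming cgroup v2 is mounted at /sys/fs/cgroup with the io controller enabled and the script runs as root:

    #!/usr/bin/env python3
    # Derive the block device's MAJ:MIN (what `lsblk -nd -o MAJ:MIN` printed)
    # and write the same io.max limits the log applies to each slice.
    import os

    def majmin(dev: str = "/dev/vda") -> str:
        st = os.stat(dev)  # block special file: device numbers live in st_rdev
        return f"{os.major(st.st_rdev)}:{os.minor(st.st_rdev)}"  # e.g. "252:0"

    LIMITS = "riops=18000 wiops=18000 rbps=262144000 wbps=262144000"

    for cg in ("init.scope", "machine.slice", "system.slice", "user.slice"):
        path = f"/sys/fs/cgroup/{cg}/io.max"
        with open(path, "w") as f:        # equivalent of the echo redirects above
            f.write(f"{majmin()} {LIMITS}\n")
        with open(path) as f:             # read back, like the final cat loop
            print(path, f.read().strip())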
Feb 02 09:13:23 np0005604791.novalocal python3[8087]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 02 09:13:26 np0005604791.novalocal sshd-session[7504]: Connection closed by 38.102.83.114 port 56174
Feb 02 09:13:26 np0005604791.novalocal sshd-session[7501]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:13:26 np0005604791.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Feb 02 09:13:26 np0005604791.novalocal systemd[1]: session-5.scope: Consumed 3.969s CPU time.
Feb 02 09:13:26 np0005604791.novalocal systemd-logind[805]: Session 5 logged out. Waiting for processes to exit.
Feb 02 09:13:26 np0005604791.novalocal systemd-logind[805]: Removed session 5.
Feb 02 09:13:27 np0005604791.novalocal sshd-session[8090]: Connection closed by 203.83.238.251 port 41252
Feb 02 09:13:28 np0005604791.novalocal sshd-session[8092]: Accepted publickey for zuul from 38.102.83.114 port 40600 ssh2: RSA SHA256:oiZKnX5kwvqrsUV0ZZjSac+GUqsMvprIFYZPo6yyNjU
Feb 02 09:13:28 np0005604791.novalocal systemd-logind[805]: New session 6 of user zuul.
Feb 02 09:13:28 np0005604791.novalocal systemd[1]: Started Session 6 of User zuul.
Feb 02 09:13:28 np0005604791.novalocal sshd-session[8092]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:13:28 np0005604791.novalocal sudo[8119]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-todbygjgoxaesyvcfdbusoelcepvdwre ; /usr/bin/python3'
Feb 02 09:13:28 np0005604791.novalocal sudo[8119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:13:28 np0005604791.novalocal python3[8121]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 02 09:13:35 np0005604791.novalocal setsebool[8164]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 02 09:13:35 np0005604791.novalocal setsebool[8164]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 02 09:13:45 np0005604791.novalocal kernel: SELinux:  Converting 386 SID table entries...
Feb 02 09:13:45 np0005604791.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 02 09:13:45 np0005604791.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 02 09:13:45 np0005604791.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 02 09:13:45 np0005604791.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 02 09:13:45 np0005604791.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 02 09:13:45 np0005604791.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 02 09:13:45 np0005604791.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 02 09:13:54 np0005604791.novalocal kernel: SELinux:  Converting 389 SID table entries...
Feb 02 09:13:54 np0005604791.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 02 09:13:54 np0005604791.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 02 09:13:54 np0005604791.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 02 09:13:54 np0005604791.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 02 09:13:54 np0005604791.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 02 09:13:54 np0005604791.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 02 09:13:54 np0005604791.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 02 09:14:11 np0005604791.novalocal dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 02 09:14:11 np0005604791.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 02 09:14:12 np0005604791.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 02 09:14:12 np0005604791.novalocal systemd[1]: Reloading.
Feb 02 09:14:12 np0005604791.novalocal systemd-rc-local-generator[8931]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:14:12 np0005604791.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 02 09:14:13 np0005604791.novalocal sudo[8119]: pam_unix(sudo:session): session closed for user root
Feb 02 09:14:16 np0005604791.novalocal python3[12810]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163ec2-ffbe-cc31-c4d5-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:14:17 np0005604791.novalocal kernel: evm: overlay not supported
Feb 02 09:14:17 np0005604791.novalocal systemd[4325]: Starting D-Bus User Message Bus...
Feb 02 09:14:17 np0005604791.novalocal systemd[4325]: Started D-Bus User Message Bus.
Feb 02 09:14:17 np0005604791.novalocal dbus-broker-launch[13941]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 02 09:14:17 np0005604791.novalocal dbus-broker-launch[13941]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 02 09:14:17 np0005604791.novalocal dbus-broker-lau[13941]: Ready
Feb 02 09:14:17 np0005604791.novalocal systemd[4325]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 02 09:14:17 np0005604791.novalocal systemd[4325]: Created slice Slice /user.
Feb 02 09:14:17 np0005604791.novalocal systemd[4325]: podman-13840.scope: unit configures an IP firewall, but not running as root.
Feb 02 09:14:17 np0005604791.novalocal systemd[4325]: (This warning is only shown for the first unit using IP firewalling.)
Feb 02 09:14:17 np0005604791.novalocal systemd[4325]: Started podman-13840.scope.
Feb 02 09:14:17 np0005604791.novalocal systemd[4325]: Started podman-pause-189668ee.scope.
Feb 02 09:14:18 np0005604791.novalocal sshd-session[8095]: Connection closed by 38.102.83.114 port 40600
Feb 02 09:14:18 np0005604791.novalocal sshd-session[8092]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:14:18 np0005604791.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Feb 02 09:14:18 np0005604791.novalocal systemd[1]: session-6.scope: Consumed 40.101s CPU time.
Feb 02 09:14:18 np0005604791.novalocal systemd-logind[805]: Session 6 logged out. Waiting for processes to exit.
Feb 02 09:14:18 np0005604791.novalocal systemd-logind[805]: Removed session 6.
Feb 02 09:14:33 np0005604791.novalocal sshd-session[21352]: Unable to negotiate with 38.102.83.241 port 56736: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 02 09:14:33 np0005604791.novalocal sshd-session[21360]: Connection closed by 38.102.83.241 port 56722 [preauth]
Feb 02 09:14:33 np0005604791.novalocal sshd-session[21354]: Unable to negotiate with 38.102.83.241 port 56738: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 02 09:14:33 np0005604791.novalocal sshd-session[21358]: Connection closed by 38.102.83.241 port 56708 [preauth]
Feb 02 09:14:33 np0005604791.novalocal sshd-session[21361]: Unable to negotiate with 38.102.83.241 port 56730: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 02 09:14:38 np0005604791.novalocal sshd-session[23430]: Accepted publickey for zuul from 38.102.83.114 port 33674 ssh2: RSA SHA256:oiZKnX5kwvqrsUV0ZZjSac+GUqsMvprIFYZPo6yyNjU
Feb 02 09:14:38 np0005604791.novalocal systemd-logind[805]: New session 7 of user zuul.
Feb 02 09:14:38 np0005604791.novalocal systemd[1]: Started Session 7 of User zuul.
Feb 02 09:14:38 np0005604791.novalocal sshd-session[23430]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:14:38 np0005604791.novalocal python3[23541]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBAK6tX32HcxwxspxXPo2b5qp7NanSpxzQsxoSXNQ1fyRzMKWHr/dNDElPeQbQ0mmJ7TyKZaqVEp5TJcSLpUuKw= zuul@np0005604789.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:14:38 np0005604791.novalocal sudo[23757]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzjyqqpqjwlsjzonnobhylrvtwzivmpk ; /usr/bin/python3'
Feb 02 09:14:38 np0005604791.novalocal sudo[23757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:14:39 np0005604791.novalocal python3[23770]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBAK6tX32HcxwxspxXPo2b5qp7NanSpxzQsxoSXNQ1fyRzMKWHr/dNDElPeQbQ0mmJ7TyKZaqVEp5TJcSLpUuKw= zuul@np0005604789.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:14:39 np0005604791.novalocal sudo[23757]: pam_unix(sudo:session): session closed for user root
Feb 02 09:14:39 np0005604791.novalocal sudo[24115]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahblvxzjaqkbdxqdzjbieunyvgqeclzc ; /usr/bin/python3'
Feb 02 09:14:39 np0005604791.novalocal sudo[24115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:14:39 np0005604791.novalocal python3[24122]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604791.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 02 09:14:39 np0005604791.novalocal useradd[24195]: new group: name=cloud-admin, GID=1002
Feb 02 09:14:39 np0005604791.novalocal useradd[24195]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Feb 02 09:14:39 np0005604791.novalocal sudo[24115]: pam_unix(sudo:session): session closed for user root
Feb 02 09:14:40 np0005604791.novalocal sudo[24324]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iifgnpwfgfpbuiydlcjenncgnpurmwzb ; /usr/bin/python3'
Feb 02 09:14:40 np0005604791.novalocal sudo[24324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:14:40 np0005604791.novalocal python3[24336]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBAK6tX32HcxwxspxXPo2b5qp7NanSpxzQsxoSXNQ1fyRzMKWHr/dNDElPeQbQ0mmJ7TyKZaqVEp5TJcSLpUuKw= zuul@np0005604789.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 02 09:14:40 np0005604791.novalocal sudo[24324]: pam_unix(sudo:session): session closed for user root
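The three authorized_key tasks install the same controller ECDSA key for zuul, root, and the freshly created cloud-admin user; the module's job is an idempotent append to each account's ~/.ssh/authorized_keys. A rough pure-Python equivalent of that behavior (illustration only; the add_authorized_key helper below is hypothetical and not the module's actual implementation, which also handles SELinux contexts and key options):

    #!/usr/bin/env python3
    # Idempotently append a public key to a user's authorized_keys,
    # creating ~/.ssh with the permissions sshd expects. Run as root
    # when targeting other accounts.
    import os
    import pwd

    def add_authorized_key(user: str, pubkey: str) -> bool:
        home = pwd.getpwnam(user).pw_dir
        ssh_dir = os.path.join(home, ".ssh")
        os.makedirs(ssh_dir, mode=0o700, exist_ok=True)
        path = os.path.join(ssh_dir, "authorized_keys")
        existing = []
        if os.path.exists(path):
            with open(path) as f:
                existing = f.read().splitlines()
        if pubkey in existing:
            return False        # key already present: report "no change"
        with open(path, "a") as f:
            f.write(pubkey + "\n")
        os.chmod(path, 0o600)
        return True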
Feb 02 09:14:40 np0005604791.novalocal sudo[24621]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iapojtwlmbjnxqruyvuujyweygqauihq ; /usr/bin/python3'
Feb 02 09:14:40 np0005604791.novalocal sudo[24621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:14:40 np0005604791.novalocal python3[24632]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:14:40 np0005604791.novalocal sudo[24621]: pam_unix(sudo:session): session closed for user root
Feb 02 09:14:41 np0005604791.novalocal sudo[24861]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmsmetmcvdmwcrhmqnhyolishmqakfkl ; /usr/bin/python3'
Feb 02 09:14:41 np0005604791.novalocal sudo[24861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:14:41 np0005604791.novalocal python3[24874]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1770023680.4837782-151-178742569184994/source _original_basename=tmpklhajr8w follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:14:41 np0005604791.novalocal sudo[24861]: pam_unix(sudo:session): session closed for user root
Feb 02 09:14:41 np0005604791.novalocal sudo[25246]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttmrvvfenoskuyjgxkdjvguvvllvjkis ; /usr/bin/python3'
Feb 02 09:14:41 np0005604791.novalocal sudo[25246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:14:42 np0005604791.novalocal python3[25256]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Feb 02 09:14:42 np0005604791.novalocal systemd[1]: Starting Hostname Service...
Feb 02 09:14:42 np0005604791.novalocal systemd[1]: Started Hostname Service.
Feb 02 09:14:42 np0005604791.novalocal systemd-hostnamed[25372]: Changed pretty hostname to 'compute-1'
Feb 02 09:14:42 compute-1 systemd-hostnamed[25372]: Hostname set to <compute-1> (static)
Feb 02 09:14:42 compute-1 NetworkManager[7213]: <info>  [1770023682.1595] hostname: static hostname changed from "np0005604791.novalocal" to "compute-1"
Feb 02 09:14:42 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 02 09:14:42 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 02 09:14:42 compute-1 sudo[25246]: pam_unix(sudo:session): session closed for user root
Feb 02 09:14:42 compute-1 sshd-session[23485]: Connection closed by 38.102.83.114 port 33674
Feb 02 09:14:42 compute-1 sshd-session[23430]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:14:42 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Feb 02 09:14:42 compute-1 systemd[1]: session-7.scope: Consumed 2.271s CPU time.
Feb 02 09:14:42 compute-1 systemd-logind[805]: Session 7 logged out. Waiting for processes to exit.
Feb 02 09:14:42 compute-1 systemd-logind[805]: Removed session 7.
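The hostname task at 09:14:42 uses the systemd strategy (use=systemd), i.e. it asks systemd-hostnamed over D-Bus to change the name; the daemon records both the pretty and static hostname, the journal's host field switches from np0005604791.novalocal to compute-1 mid-stream, and NetworkManager logs the change. A minimal sketch of the same effect under the assumption that hostnamectl is available (it fronts the same hostnamed API):

    #!/usr/bin/env python3
    # Set the static hostname through systemd-hostnamed, roughly what the
    # ansible.builtin.hostname task with use=systemd accomplishes.
    import subprocess
    subprocess.run(["hostnamectl", "set-hostname", "compute-1"], check=True)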
Feb 02 09:14:52 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 02 09:14:52 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 02 09:14:52 compute-1 systemd[1]: man-db-cache-update.service: Consumed 46.634s CPU time.
Feb 02 09:14:52 compute-1 systemd[1]: run-r03f04290d3444c01b388663d1379ff93.service: Deactivated successfully.
Feb 02 09:14:52 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 02 09:15:12 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 02 09:15:52 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Feb 02 09:15:52 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 02 09:15:52 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Feb 02 09:15:52 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 02 09:18:07 compute-1 sshd-session[29989]: Accepted publickey for zuul from 38.102.83.241 port 35890 ssh2: RSA SHA256:oiZKnX5kwvqrsUV0ZZjSac+GUqsMvprIFYZPo6yyNjU
Feb 02 09:18:07 compute-1 systemd-logind[805]: New session 8 of user zuul.
Feb 02 09:18:07 compute-1 systemd[1]: Started Session 8 of User zuul.
Feb 02 09:18:07 compute-1 sshd-session[29989]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:18:07 compute-1 python3[30065]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:18:09 compute-1 sudo[30179]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhvyjidehkdylxvoegrkhmcxyyqwikmi ; /usr/bin/python3'
Feb 02 09:18:09 compute-1 sudo[30179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:09 compute-1 python3[30181]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:18:09 compute-1 sudo[30179]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:09 compute-1 sudo[30252]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwgujswedmygsduwnqnwihlelthqwzmy ; /usr/bin/python3'
Feb 02 09:18:09 compute-1 sudo[30252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:09 compute-1 python3[30254]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=delorean.repo follow=False checksum=cc4ab4695da8ec58c451521a3dd2f41014af145d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:18:09 compute-1 sudo[30252]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:09 compute-1 sudo[30278]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heuiacdnpihkhuivjryasyiqbihnkylj ; /usr/bin/python3'
Feb 02 09:18:09 compute-1 sudo[30278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:10 compute-1 python3[30280]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:18:10 compute-1 sudo[30278]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:10 compute-1 sudo[30351]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twfkdtomgdiemkavchwocptwmjnndvwq ; /usr/bin/python3'
Feb 02 09:18:10 compute-1 sudo[30351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:10 compute-1 python3[30353]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:18:10 compute-1 sudo[30351]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:10 compute-1 sudo[30377]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kywfbumvxxyrkhvewudxqwwhjhsufmww ; /usr/bin/python3'
Feb 02 09:18:10 compute-1 sudo[30377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:10 compute-1 python3[30379]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:18:10 compute-1 sudo[30377]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:10 compute-1 sudo[30450]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djgaeqhyyxhbeaemmgorkpubtlemuqjm ; /usr/bin/python3'
Feb 02 09:18:10 compute-1 sudo[30450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:11 compute-1 python3[30452]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:18:11 compute-1 sudo[30450]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:11 compute-1 sudo[30476]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbnvdkwcxtedmhceehrkqywazoolcdyb ; /usr/bin/python3'
Feb 02 09:18:11 compute-1 sudo[30476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:11 compute-1 python3[30478]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:18:11 compute-1 sudo[30476]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:11 compute-1 sudo[30549]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzqlojdtlbayrkaxcbuyfsnuhxxxoueu ; /usr/bin/python3'
Feb 02 09:18:11 compute-1 sudo[30549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:11 compute-1 python3[30551]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:18:11 compute-1 sudo[30549]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:11 compute-1 sudo[30575]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tztaymqlyizptwpopgayqnsekfxhtwrf ; /usr/bin/python3'
Feb 02 09:18:11 compute-1 sudo[30575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:11 compute-1 python3[30577]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:18:11 compute-1 sudo[30575]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:12 compute-1 sudo[30648]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfclwfhvjuaybamsydlscuguygspmozi ; /usr/bin/python3'
Feb 02 09:18:12 compute-1 sudo[30648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:12 compute-1 python3[30650]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:18:12 compute-1 sudo[30648]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:12 compute-1 sudo[30674]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imazkbxkrpsmvvntxtgqabqctmoyhims ; /usr/bin/python3'
Feb 02 09:18:12 compute-1 sudo[30674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:12 compute-1 python3[30676]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:18:12 compute-1 sudo[30674]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:12 compute-1 sudo[30747]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkznpvkhjcpkqcqplqnmnifysunzzoug ; /usr/bin/python3'
Feb 02 09:18:12 compute-1 sudo[30747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:12 compute-1 python3[30749]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:18:12 compute-1 sudo[30747]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:13 compute-1 sudo[30773]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmjlarfozspzdnscahqaousatrcwikqu ; /usr/bin/python3'
Feb 02 09:18:13 compute-1 sudo[30773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:13 compute-1 python3[30775]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:18:13 compute-1 sudo[30773]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:13 compute-1 sudo[30846]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwpmsvqccyrpcjuojgzgrmqqlmfjyxdo ; /usr/bin/python3'
Feb 02 09:18:13 compute-1 sudo[30846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:18:13 compute-1 python3[30848]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1770023889.0992186-33995-203990774314588/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=362a603578148d54e8cd25942b88d7f471cc677a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:18:13 compute-1 sudo[30846]: pam_unix(sudo:session): session closed for user root
Feb 02 09:18:25 compute-1 python3[30896]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:22:01 compute-1 anacron[4318]: Job `cron.daily' started
Feb 02 09:22:01 compute-1 anacron[4318]: Job `cron.daily' terminated
Feb 02 09:23:24 compute-1 sshd-session[29992]: Received disconnect from 38.102.83.241 port 35890:11: disconnected by user
Feb 02 09:23:24 compute-1 sshd-session[29992]: Disconnected from user zuul 38.102.83.241 port 35890
Feb 02 09:23:24 compute-1 sshd-session[29989]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:23:24 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Feb 02 09:23:24 compute-1 systemd[1]: session-8.scope: Consumed 5.195s CPU time.
Feb 02 09:23:24 compute-1 systemd-logind[805]: Session 8 logged out. Waiting for processes to exit.
Feb 02 09:23:24 compute-1 systemd-logind[805]: Removed session 8.
Feb 02 09:26:28 compute-1 sshd-session[30902]: Connection closed by 45.148.10.240 port 39580
Feb 02 09:28:52 compute-1 systemd[1]: Starting dnf makecache...
Feb 02 09:28:52 compute-1 dnf[30904]: Failed determining last makecache time.
Feb 02 09:28:52 compute-1 dnf[30904]: delorean-openstack-barbican-42b4c41831408a8e323 344 kB/s |  13 kB     00:00
Feb 02 09:28:52 compute-1 dnf[30904]: delorean-python-glean-642fffe0203a8ffcc2443db52 2.4 MB/s |  65 kB     00:00
Feb 02 09:28:52 compute-1 dnf[30904]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Feb 02 09:28:52 compute-1 dnf[30904]: delorean-python-stevedore-c4acc5639fd2329372142 5.0 MB/s | 131 kB     00:00
Feb 02 09:28:52 compute-1 dnf[30904]: delorean-python-cloudkitty-tests-tempest-783703 1.4 MB/s |  32 kB     00:00
Feb 02 09:28:52 compute-1 dnf[30904]: delorean-diskimage-builder-61b717cc45660834fe9a  11 MB/s | 349 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-openstack-nova-eaa65f0b85123a4ee343246 1.7 MB/s |  42 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-python-designate-tests-tempest-347fdbc 645 kB/s |  18 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-openstack-glance-1fd12c29b339f30fe823e 655 kB/s |  18 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.0 MB/s |  29 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-openstack-manila-d783d10e75495b73866db 897 kB/s |  25 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-openstack-neutron-95cadbd379667c8520c8 5.3 MB/s | 154 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-openstack-octavia-5975097dd4b021385178 946 kB/s |  26 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-openstack-watcher-c014f81a8647287f6dcc 593 kB/s |  16 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-python-tcib-78032d201b02cee27e8e644c61 307 kB/s | 7.4 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.0 MB/s | 144 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-openstack-swift-dc98a8463506ac520c469a 489 kB/s |  14 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-python-tempestconf-8515371b7cceebd4282 2.1 MB/s |  53 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: delorean-openstack-heat-ui-013accbfd179753bc3f0 3.3 MB/s |  96 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: CentOS Stream 9 - BaseOS                         51 kB/s | 6.7 kB     00:00
Feb 02 09:28:53 compute-1 dnf[30904]: CentOS Stream 9 - AppStream                      54 kB/s | 6.8 kB     00:00
Feb 02 09:28:54 compute-1 dnf[30904]: CentOS Stream 9 - CRB                            28 kB/s | 6.6 kB     00:00
Feb 02 09:28:54 compute-1 dnf[30904]: CentOS Stream 9 - Extras packages                55 kB/s | 7.3 kB     00:00
Feb 02 09:28:54 compute-1 dnf[30904]: dlrn-antelope-testing                            29 MB/s | 1.1 MB     00:00
Feb 02 09:28:54 compute-1 dnf[30904]: dlrn-antelope-build-deps                         14 MB/s | 461 kB     00:00
Feb 02 09:28:55 compute-1 dnf[30904]: centos9-rabbitmq                                9.2 MB/s | 123 kB     00:00
Feb 02 09:28:55 compute-1 dnf[30904]: centos9-storage                                  20 MB/s | 415 kB     00:00
Feb 02 09:28:55 compute-1 dnf[30904]: centos9-opstools                                4.0 MB/s |  51 kB     00:00
Feb 02 09:28:55 compute-1 dnf[30904]: NFV SIG OpenvSwitch                              26 MB/s | 461 kB     00:00
Feb 02 09:28:55 compute-1 dnf[30904]: repo-setup-centos-appstream                     121 MB/s |  26 MB     00:00
Feb 02 09:29:01 compute-1 dnf[30904]: repo-setup-centos-baseos                        101 MB/s | 8.9 MB     00:00
Feb 02 09:29:02 compute-1 dnf[30904]: repo-setup-centos-highavailability               32 MB/s | 744 kB     00:00
Feb 02 09:29:02 compute-1 dnf[30904]: repo-setup-centos-powertools                     90 MB/s | 7.6 MB     00:00
Feb 02 09:29:05 compute-1 dnf[30904]: Extra Packages for Enterprise Linux 9 - x86_64   17 MB/s |  20 MB     00:01
Feb 02 09:29:17 compute-1 dnf[30904]: Metadata cache created.
Feb 02 09:29:17 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 02 09:29:17 compute-1 systemd[1]: Finished dnf makecache.
Feb 02 09:29:17 compute-1 systemd[1]: dnf-makecache.service: Consumed 23.152s CPU time.
Feb 02 09:29:51 compute-1 sshd-session[31006]: Accepted publickey for zuul from 192.168.122.30 port 43682 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:29:51 compute-1 systemd-logind[805]: New session 9 of user zuul.
Feb 02 09:29:51 compute-1 systemd[1]: Started Session 9 of User zuul.
Feb 02 09:29:51 compute-1 sshd-session[31006]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:29:52 compute-1 python3.9[31159]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:29:53 compute-1 sudo[31338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywjkilkjqcrndvcfragtkawhkpkgfqnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024592.9833467-52-227909878675132/AnsiballZ_command.py'
Feb 02 09:29:53 compute-1 sudo[31338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:29:53 compute-1 python3.9[31340]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:29:59 compute-1 sudo[31338]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:00 compute-1 sshd-session[31009]: Connection closed by 192.168.122.30 port 43682
Feb 02 09:30:00 compute-1 sshd-session[31006]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:30:00 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Feb 02 09:30:00 compute-1 systemd[1]: session-9.scope: Consumed 7.080s CPU time.
Feb 02 09:30:00 compute-1 systemd-logind[805]: Session 9 logged out. Waiting for processes to exit.
Feb 02 09:30:00 compute-1 systemd-logind[805]: Removed session 9.
Feb 02 09:30:16 compute-1 sshd-session[31398]: Accepted publickey for zuul from 192.168.122.30 port 43612 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:30:16 compute-1 systemd-logind[805]: New session 10 of user zuul.
Feb 02 09:30:16 compute-1 systemd[1]: Started Session 10 of User zuul.
Feb 02 09:30:16 compute-1 sshd-session[31398]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:30:17 compute-1 python3.9[31551]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 02 09:30:18 compute-1 python3.9[31725]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:30:18 compute-1 sudo[31875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxaretknxnvpvvdaryfrlbdqkmezhfvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024618.571829-89-256190340113007/AnsiballZ_command.py'
Feb 02 09:30:18 compute-1 sudo[31875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:19 compute-1 python3.9[31877]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:30:19 compute-1 sudo[31875]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:19 compute-1 sudo[32028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btzonffmpaqlnestyrqecgyxjbcgxvjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024619.5721145-125-198810723573471/AnsiballZ_stat.py'
Feb 02 09:30:19 compute-1 sudo[32028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:20 compute-1 python3.9[32030]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:30:20 compute-1 sudo[32028]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:20 compute-1 sudo[32180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjsumihvnkztyqkyfgufrqpurfcexgjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024620.3311296-149-244592582166388/AnsiballZ_file.py'
Feb 02 09:30:20 compute-1 sudo[32180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:20 compute-1 python3.9[32182]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:30:20 compute-1 sudo[32180]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:21 compute-1 sudo[32332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mivtskvdsshtnluwxyjyhomjuoswymer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024621.0426342-173-256343527697352/AnsiballZ_stat.py'
Feb 02 09:30:21 compute-1 sudo[32332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:21 compute-1 python3.9[32334]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:30:21 compute-1 sudo[32332]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:21 compute-1 sudo[32455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guubhapljuvbewrlanbcrcvjlrkrdxtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024621.0426342-173-256343527697352/AnsiballZ_copy.py'
Feb 02 09:30:21 compute-1 sudo[32455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:22 compute-1 python3.9[32457]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024621.0426342-173-256343527697352/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:30:22 compute-1 sudo[32455]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:22 compute-1 sudo[32607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vurqevgqnrdqrsnxnsizfdrcsqmppiqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024622.2675476-218-34942823507536/AnsiballZ_setup.py'
Feb 02 09:30:22 compute-1 sudo[32607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:22 compute-1 python3.9[32609]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:30:22 compute-1 sudo[32607]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:23 compute-1 sudo[32763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyyrntckynctrpqhkbbelyzmngdfeifg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024623.206533-242-148155497954766/AnsiballZ_file.py'
Feb 02 09:30:23 compute-1 sudo[32763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:23 compute-1 python3.9[32765]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:30:23 compute-1 sudo[32763]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:24 compute-1 sudo[32915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwnxzxsyvejwsojxiloprcfdrxnzqcci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024623.8914032-269-144876528325947/AnsiballZ_file.py'
Feb 02 09:30:24 compute-1 sudo[32915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:24 compute-1 python3.9[32917]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:30:24 compute-1 sudo[32915]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:25 compute-1 python3.9[33067]: ansible-ansible.builtin.service_facts Invoked
Feb 02 09:30:29 compute-1 python3.9[33320]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:30:30 compute-1 python3.9[33470]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:30:31 compute-1 python3.9[33624]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:30:32 compute-1 sudo[33780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvgnxcddtkzgzcijrhpanvxyhpqjdwsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024632.4396365-413-200230015634936/AnsiballZ_setup.py'
Feb 02 09:30:32 compute-1 sudo[33780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:32 compute-1 python3.9[33782]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:30:33 compute-1 sudo[33780]: pam_unix(sudo:session): session closed for user root
Feb 02 09:30:33 compute-1 sudo[33864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frixlemtoqcmyodypexegnrykdtxrowt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024632.4396365-413-200230015634936/AnsiballZ_dnf.py'
Feb 02 09:30:33 compute-1 sudo[33864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:30:33 compute-1 python3.9[33866]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:31:13 compute-1 systemd[1]: Reloading.
Feb 02 09:31:13 compute-1 systemd-rc-local-generator[34063]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:31:13 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 02 09:31:14 compute-1 systemd[1]: Reloading.
Feb 02 09:31:14 compute-1 systemd-rc-local-generator[34104]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:31:14 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 02 09:31:14 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 02 09:31:14 compute-1 systemd[1]: Reloading.
Feb 02 09:31:14 compute-1 systemd-rc-local-generator[34143]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:31:14 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Feb 02 09:31:14 compute-1 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb 02 09:31:14 compute-1 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb 02 09:31:14 compute-1 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb 02 09:32:08 compute-1 kernel: SELinux:  Converting 2728 SID table entries...
Feb 02 09:32:08 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 02 09:32:08 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 02 09:32:08 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 02 09:32:08 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 02 09:32:08 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 02 09:32:08 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 02 09:32:08 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 02 09:32:08 compute-1 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 02 09:32:08 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 02 09:32:08 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 02 09:32:08 compute-1 systemd[1]: Reloading.
Feb 02 09:32:08 compute-1 systemd-rc-local-generator[34475]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:32:09 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 02 09:32:09 compute-1 sudo[33864]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:09 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 02 09:32:09 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 02 09:32:09 compute-1 systemd[1]: run-r8e3c2b150b6f4bd895136e821349cf4e.service: Deactivated successfully.
Feb 02 09:32:11 compute-1 sudo[35392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bandkesbivodvezgryvztosgjxmjpuoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024731.4486105-450-155271380215874/AnsiballZ_command.py'
Feb 02 09:32:11 compute-1 sudo[35392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:11 compute-1 python3.9[35394]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:32:12 compute-1 sshd-session[35471]: Connection closed by 80.94.92.184 port 50938
Feb 02 09:32:12 compute-1 sudo[35392]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:13 compute-1 sudo[35674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcicmhkanqcrkpklrbcsejmhvahnnrup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024733.364128-473-208234613540120/AnsiballZ_selinux.py'
Feb 02 09:32:13 compute-1 sudo[35674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:14 compute-1 python3.9[35676]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 02 09:32:14 compute-1 sudo[35674]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:14 compute-1 sudo[35826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzqzrqzavbvxosbojszkgafvprqkzlyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024734.6829805-506-281077885807293/AnsiballZ_command.py'
Feb 02 09:32:14 compute-1 sudo[35826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:15 compute-1 python3.9[35828]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 02 09:32:15 compute-1 sudo[35826]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:16 compute-1 sudo[35980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtywbdgccfkluabugvqmmsgwyhqcgpfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024735.9096084-530-256452975793919/AnsiballZ_file.py'
Feb 02 09:32:16 compute-1 sudo[35980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:17 compute-1 sshd-session[35983]: Connection closed by 45.148.10.121 port 40100 [preauth]
Feb 02 09:32:17 compute-1 python3.9[35982]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:32:17 compute-1 sudo[35980]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:18 compute-1 sudo[36134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeopgkzjyzxhcvqfpraggucbbprfkfkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024737.9285595-554-65963493234450/AnsiballZ_mount.py'
Feb 02 09:32:18 compute-1 sudo[36134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:18 compute-1 python3.9[36136]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 02 09:32:18 compute-1 sudo[36134]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:19 compute-1 sudo[36286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrcitefievwihdtpktipsuamohkbcfsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024739.4764907-638-269725839642329/AnsiballZ_file.py'
Feb 02 09:32:19 compute-1 sudo[36286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:19 compute-1 python3.9[36288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:32:19 compute-1 sudo[36286]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:20 compute-1 sudo[36438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzzkqmrvwkspqjqvhaoksjfbatwinhyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024740.1902425-662-60105461617986/AnsiballZ_stat.py'
Feb 02 09:32:20 compute-1 sudo[36438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:20 compute-1 python3.9[36440]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:32:20 compute-1 sudo[36438]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:21 compute-1 sudo[36561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odjmiusiyotggaplndsbyreljzdebctt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024740.1902425-662-60105461617986/AnsiballZ_copy.py'
Feb 02 09:32:21 compute-1 sudo[36561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:25 compute-1 python3.9[36563]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024740.1902425-662-60105461617986/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:32:25 compute-1 sudo[36561]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:26 compute-1 sudo[36713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igieamlsunhmgwcismiqhdqdbcrdyudt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024746.250258-734-55479328057664/AnsiballZ_stat.py'
Feb 02 09:32:26 compute-1 sudo[36713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:26 compute-1 python3.9[36715]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:32:26 compute-1 sudo[36713]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:27 compute-1 sudo[36865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjwcdiauthedhfnwdvzmophoaclzhhlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024746.907189-758-240064108553587/AnsiballZ_command.py'
Feb 02 09:32:27 compute-1 sudo[36865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:27 compute-1 python3.9[36867]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:32:27 compute-1 sudo[36865]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:27 compute-1 sudo[37018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khanradgnebaettvqeeleorkbgklcpaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024747.5757685-782-264657015245871/AnsiballZ_file.py'
Feb 02 09:32:27 compute-1 sudo[37018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:28 compute-1 python3.9[37020]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:32:28 compute-1 sudo[37018]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:28 compute-1 sudo[37170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwlmdcauolqatiyskdulorjopicjlpmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024748.5014236-815-72559365241560/AnsiballZ_getent.py'
Feb 02 09:32:28 compute-1 sudo[37170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:29 compute-1 python3.9[37172]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 02 09:32:29 compute-1 sudo[37170]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:29 compute-1 sudo[37323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xezgaukcseqodfbbkugnxocsueffnwdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024749.275705-839-250950629685298/AnsiballZ_group.py'
Feb 02 09:32:29 compute-1 sudo[37323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:29 compute-1 python3.9[37325]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 02 09:32:29 compute-1 groupadd[37326]: group added to /etc/group: name=qemu, GID=107
Feb 02 09:32:29 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:32:29 compute-1 groupadd[37326]: group added to /etc/gshadow: name=qemu
Feb 02 09:32:29 compute-1 groupadd[37326]: new group: name=qemu, GID=107
Feb 02 09:32:29 compute-1 sudo[37323]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:30 compute-1 sudo[37482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzaebvcmglqgcuwhtfazpibukqwdcpzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024750.0942774-863-162785282690544/AnsiballZ_user.py'
Feb 02 09:32:30 compute-1 sudo[37482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:30 compute-1 python3.9[37484]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 02 09:32:30 compute-1 useradd[37486]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Feb 02 09:32:30 compute-1 sudo[37482]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:31 compute-1 sudo[37642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qggpuxykgothzekydmxnjeegrnpuwpuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024751.0402172-887-255315740040576/AnsiballZ_getent.py'
Feb 02 09:32:31 compute-1 sudo[37642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:31 compute-1 python3.9[37644]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 02 09:32:31 compute-1 sudo[37642]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:31 compute-1 sudo[37795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtpsyfeapkniymiefhihscggpguyrfyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024751.6698945-911-67255890961412/AnsiballZ_group.py'
Feb 02 09:32:31 compute-1 sudo[37795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:32 compute-1 python3.9[37797]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 02 09:32:32 compute-1 groupadd[37798]: group added to /etc/group: name=hugetlbfs, GID=42477
Feb 02 09:32:32 compute-1 groupadd[37798]: group added to /etc/gshadow: name=hugetlbfs
Feb 02 09:32:32 compute-1 groupadd[37798]: new group: name=hugetlbfs, GID=42477
Feb 02 09:32:32 compute-1 sudo[37795]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:32 compute-1 sudo[37953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdnfptwknzjeuegckncbpnrvdwqigqyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024752.3801327-938-15151971214153/AnsiballZ_file.py'
Feb 02 09:32:32 compute-1 sudo[37953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:32 compute-1 python3.9[37955]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 02 09:32:32 compute-1 sudo[37953]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:33 compute-1 sudo[38105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvynsleaucijlmdukjeexzimzqjfsdat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024753.3844838-971-23880400034814/AnsiballZ_dnf.py'
Feb 02 09:32:33 compute-1 sudo[38105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:33 compute-1 python3.9[38107]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:32:35 compute-1 sudo[38105]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:35 compute-1 sudo[38258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgokuhmimqzxjhgopndjmdnxgntqybgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024755.5747018-995-69200429661630/AnsiballZ_file.py'
Feb 02 09:32:35 compute-1 sudo[38258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:36 compute-1 python3.9[38260]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:32:36 compute-1 sudo[38258]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:36 compute-1 sudo[38410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-econccjloadnajuobmuidnsylsqdhrnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024756.255102-1020-264444790041307/AnsiballZ_stat.py'
Feb 02 09:32:36 compute-1 sudo[38410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:36 compute-1 python3.9[38412]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:32:36 compute-1 sudo[38410]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:37 compute-1 sudo[38533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqkskpiboouzxxzinrdexuemainbzgfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024756.255102-1020-264444790041307/AnsiballZ_copy.py'
Feb 02 09:32:37 compute-1 sudo[38533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:37 compute-1 python3.9[38535]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024756.255102-1020-264444790041307/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:32:37 compute-1 sudo[38533]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:38 compute-1 sudo[38685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xikxcmogmodbrzyageyaociiwfopyjke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024757.4662416-1064-252016023451144/AnsiballZ_systemd.py'
Feb 02 09:32:38 compute-1 sudo[38685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:38 compute-1 python3.9[38687]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:32:38 compute-1 systemd[1]: Starting Load Kernel Modules...
Feb 02 09:32:38 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 02 09:32:38 compute-1 kernel: Bridge firewalling registered
Feb 02 09:32:38 compute-1 systemd-modules-load[38691]: Inserted module 'br_netfilter'
Feb 02 09:32:38 compute-1 systemd[1]: Finished Load Kernel Modules.
Feb 02 09:32:38 compute-1 sudo[38685]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:38 compute-1 sudo[38844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhqecljcttmotuqhwzxdfjtananzjjmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024758.661151-1088-37317981670626/AnsiballZ_stat.py'
Feb 02 09:32:38 compute-1 sudo[38844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:39 compute-1 python3.9[38846]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:32:39 compute-1 sudo[38844]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:39 compute-1 sudo[38967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgqqneseoxdourspgcftrkqutmoxwjxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024758.661151-1088-37317981670626/AnsiballZ_copy.py'
Feb 02 09:32:39 compute-1 sudo[38967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:39 compute-1 python3.9[38969]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024758.661151-1088-37317981670626/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:32:39 compute-1 sudo[38967]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:40 compute-1 sudo[39119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpmwtrfllwijiwyzoiisfdfkgsforwnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024760.197125-1142-150157227088285/AnsiballZ_dnf.py'
Feb 02 09:32:40 compute-1 sudo[39119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:40 compute-1 python3.9[39121]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:32:43 compute-1 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb 02 09:32:43 compute-1 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb 02 09:32:44 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 02 09:32:44 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 02 09:32:44 compute-1 systemd[1]: Reloading.
Feb 02 09:32:44 compute-1 systemd-rc-local-generator[39179]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:32:44 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 02 09:32:44 compute-1 sudo[39119]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:45 compute-1 python3.9[41039]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:32:46 compute-1 python3.9[42243]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 02 09:32:47 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 02 09:32:47 compute-1 python3.9[43134]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:32:47 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 02 09:32:47 compute-1 systemd[1]: man-db-cache-update.service: Consumed 3.586s CPU time.
Feb 02 09:32:47 compute-1 systemd[1]: run-r4d41e351b2a34d2086783bc988b436fd.service: Deactivated successfully.
Feb 02 09:32:47 compute-1 sudo[43323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zheuroprqjszwxeejmwzgidugrfgbpxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024767.44258-1259-86398151451578/AnsiballZ_command.py'
Feb 02 09:32:47 compute-1 sudo[43323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:47 compute-1 python3.9[43325]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:32:48 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 02 09:32:48 compute-1 systemd[1]: Starting Authorization Manager...
Feb 02 09:32:48 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 02 09:32:48 compute-1 polkitd[43542]: Started polkitd version 0.117
Feb 02 09:32:48 compute-1 polkitd[43542]: Loading rules from directory /etc/polkit-1/rules.d
Feb 02 09:32:48 compute-1 polkitd[43542]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 02 09:32:48 compute-1 polkitd[43542]: Finished loading, compiling and executing 2 rules
Feb 02 09:32:48 compute-1 systemd[1]: Started Authorization Manager.
Feb 02 09:32:48 compute-1 polkitd[43542]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 02 09:32:48 compute-1 sudo[43323]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:49 compute-1 sudo[43710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sorluhimtetdjoiogkyocpyyksnxcpvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024768.973284-1286-149691979409086/AnsiballZ_systemd.py'
Feb 02 09:32:49 compute-1 sudo[43710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:49 compute-1 python3.9[43712]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:32:50 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 02 09:32:50 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Feb 02 09:32:50 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 02 09:32:50 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 02 09:32:50 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 02 09:32:50 compute-1 sudo[43710]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:51 compute-1 python3.9[43873]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 02 09:32:55 compute-1 sudo[44023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trndrmyuvccdzalefotcxhqdqifghqnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024774.7546184-1457-130065643790734/AnsiballZ_systemd.py'
Feb 02 09:32:55 compute-1 sudo[44023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:55 compute-1 python3.9[44025]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:32:55 compute-1 systemd[1]: Reloading.
Feb 02 09:32:55 compute-1 systemd-rc-local-generator[44051]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:32:55 compute-1 sudo[44023]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:55 compute-1 sudo[44212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upysipnzjgggovjicknkxgesyoiyunyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024775.6382718-1457-19580717935621/AnsiballZ_systemd.py'
Feb 02 09:32:55 compute-1 sudo[44212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:56 compute-1 python3.9[44214]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:32:56 compute-1 systemd[1]: Reloading.
Feb 02 09:32:56 compute-1 systemd-rc-local-generator[44244]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:32:56 compute-1 sudo[44212]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:57 compute-1 sudo[44401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqfkxmbfdnbghhgqwqjbzvrwtmntssum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024776.755067-1505-34485746911620/AnsiballZ_command.py'
Feb 02 09:32:57 compute-1 sudo[44401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:57 compute-1 python3.9[44403]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:32:57 compute-1 sudo[44401]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:57 compute-1 sudo[44554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zudsgeyxgrkqzoiotkscjstqsdfzhpkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024777.4033616-1529-208802094163946/AnsiballZ_command.py'
Feb 02 09:32:57 compute-1 sudo[44554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:57 compute-1 python3.9[44556]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:32:57 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Feb 02 09:32:57 compute-1 sudo[44554]: pam_unix(sudo:session): session closed for user root
Feb 02 09:32:58 compute-1 sudo[44707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exwalqixohhdeqirkznscfftocqvusqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024778.1316826-1553-91443517947114/AnsiballZ_command.py'
Feb 02 09:32:58 compute-1 sudo[44707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:32:58 compute-1 python3.9[44709]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:32:59 compute-1 sudo[44707]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:00 compute-1 sudo[44869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-worqmshlfabgfrnvwbicttbfxqymswmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024780.3205717-1577-258821966553120/AnsiballZ_command.py'
Feb 02 09:33:00 compute-1 sudo[44869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:00 compute-1 python3.9[44871]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:33:00 compute-1 sudo[44869]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:01 compute-1 sudo[45022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjityopvfezofotwaerdzxpypjjccnec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024780.9967012-1601-201858062840845/AnsiballZ_systemd.py'
Feb 02 09:33:01 compute-1 sudo[45022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:01 compute-1 python3.9[45024]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:33:01 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 02 09:33:01 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Feb 02 09:33:01 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Feb 02 09:33:01 compute-1 systemd[1]: Starting Apply Kernel Variables...
Feb 02 09:33:01 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 02 09:33:01 compute-1 systemd[1]: Finished Apply Kernel Variables.
Feb 02 09:33:01 compute-1 sudo[45022]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:02 compute-1 sshd-session[31401]: Connection closed by 192.168.122.30 port 43612
Feb 02 09:33:02 compute-1 sshd-session[31398]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:33:02 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Feb 02 09:33:02 compute-1 systemd[1]: session-10.scope: Consumed 1min 59.714s CPU time.
Feb 02 09:33:02 compute-1 systemd-logind[805]: Session 10 logged out. Waiting for processes to exit.
Feb 02 09:33:02 compute-1 systemd-logind[805]: Removed session 10.
Feb 02 09:33:07 compute-1 sshd-session[45054]: Accepted publickey for zuul from 192.168.122.30 port 60874 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:33:07 compute-1 systemd-logind[805]: New session 11 of user zuul.
Feb 02 09:33:07 compute-1 systemd[1]: Started Session 11 of User zuul.
Feb 02 09:33:07 compute-1 sshd-session[45054]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:33:08 compute-1 python3.9[45207]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:33:09 compute-1 sudo[45361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isnrsiknfspevudkdhstpbxuzreisnak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024789.5216937-64-92065018923294/AnsiballZ_getent.py'
Feb 02 09:33:09 compute-1 sudo[45361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:10 compute-1 python3.9[45363]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 02 09:33:10 compute-1 sudo[45361]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:10 compute-1 sudo[45514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlxqctnsumwhycczvjmafwboixhgprub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024790.5773556-88-82422769935003/AnsiballZ_group.py'
Feb 02 09:33:10 compute-1 sudo[45514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:11 compute-1 python3.9[45516]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 02 09:33:11 compute-1 groupadd[45517]: group added to /etc/group: name=openvswitch, GID=42476
Feb 02 09:33:11 compute-1 groupadd[45517]: group added to /etc/gshadow: name=openvswitch
Feb 02 09:33:11 compute-1 groupadd[45517]: new group: name=openvswitch, GID=42476
Feb 02 09:33:11 compute-1 sudo[45514]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:11 compute-1 sudo[45672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypwdfmjvjpdnghtjjwumgltltnpdaohi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024791.372339-112-227720686119880/AnsiballZ_user.py'
Feb 02 09:33:11 compute-1 sudo[45672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:11 compute-1 python3.9[45674]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 02 09:33:12 compute-1 useradd[45676]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Feb 02 09:33:12 compute-1 useradd[45676]: add 'openvswitch' to group 'hugetlbfs'
Feb 02 09:33:12 compute-1 useradd[45676]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 02 09:33:12 compute-1 sudo[45672]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:13 compute-1 sudo[45832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltcfvplothnpqqssrsijeawnfxoyqtbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024792.7888474-142-164601939756245/AnsiballZ_setup.py'
Feb 02 09:33:13 compute-1 sudo[45832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:13 compute-1 python3.9[45834]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:33:13 compute-1 sudo[45832]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:13 compute-1 sudo[45916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pflppimyavfowrsnlkmwochhyuwlldwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024792.7888474-142-164601939756245/AnsiballZ_dnf.py'
Feb 02 09:33:13 compute-1 sudo[45916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:14 compute-1 python3.9[45918]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 02 09:33:16 compute-1 sudo[45916]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:16 compute-1 sudo[46079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roafwhkzdephyfszzbhpblqaexxcbbfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024796.4408975-184-246014993041575/AnsiballZ_dnf.py'
Feb 02 09:33:16 compute-1 sudo[46079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:16 compute-1 python3.9[46081]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:33:26 compute-1 kernel: SELinux:  Converting 2740 SID table entries...
Feb 02 09:33:26 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 02 09:33:26 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 02 09:33:26 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 02 09:33:26 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 02 09:33:26 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 02 09:33:26 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 02 09:33:26 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 02 09:33:26 compute-1 groupadd[46104]: group added to /etc/group: name=unbound, GID=994
Feb 02 09:33:26 compute-1 groupadd[46104]: group added to /etc/gshadow: name=unbound
Feb 02 09:33:26 compute-1 groupadd[46104]: new group: name=unbound, GID=994
Feb 02 09:33:26 compute-1 useradd[46111]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Feb 02 09:33:27 compute-1 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 02 09:33:27 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 02 09:33:28 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 02 09:33:28 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 02 09:33:28 compute-1 systemd[1]: Reloading.
Feb 02 09:33:28 compute-1 systemd-rc-local-generator[46609]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:33:28 compute-1 systemd-sysv-generator[46612]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:33:28 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 02 09:33:28 compute-1 sudo[46079]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:28 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 02 09:33:28 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 02 09:33:28 compute-1 systemd[1]: run-r5dc8cebddb074e7ba314dac0c9caa4ca.service: Deactivated successfully.
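The two dnf invocations above split package handling into a download-only pass (download_only=True) followed by the actual install transaction, during which the unbound dependency account and the SELinux policy reload appear in the log. A hedged reconstruction of the corresponding tasks (the log records them via ansible.legacy.dnf; the builtin module takes the same arguments):

- name: Pre-download openvswitch
  ansible.builtin.dnf:
    name: openvswitch
    download_only: true

- name: Install openvswitch
  ansible.builtin.dnf:
    name: openvswitch
    state: present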
Feb 02 09:33:30 compute-1 sudo[47179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvdgmqcaryddapfylumuukccwvttjibj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024809.633507-208-55489341030612/AnsiballZ_systemd.py'
Feb 02 09:33:30 compute-1 sudo[47179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:30 compute-1 python3.9[47181]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 02 09:33:30 compute-1 systemd[1]: Reloading.
Feb 02 09:33:30 compute-1 systemd-sysv-generator[47210]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:33:30 compute-1 systemd-rc-local-generator[47206]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:33:30 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Feb 02 09:33:30 compute-1 chown[47223]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 02 09:33:30 compute-1 ovs-ctl[47228]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 02 09:33:30 compute-1 ovs-ctl[47228]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 02 09:33:30 compute-1 ovs-ctl[47228]: Starting ovsdb-server [  OK  ]
Feb 02 09:33:30 compute-1 ovs-vsctl[47278]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 02 09:33:31 compute-1 ovs-vsctl[47298]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2f54a3b0-231a-4b96-9e3a-0a36e3e73216\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb 02 09:33:31 compute-1 ovs-ctl[47228]: Configuring Open vSwitch system IDs [  OK  ]
Feb 02 09:33:31 compute-1 ovs-vsctl[47304]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Feb 02 09:33:31 compute-1 ovs-ctl[47228]: Enabling remote OVSDB managers [  OK  ]
Feb 02 09:33:31 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Feb 02 09:33:31 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 02 09:33:31 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 02 09:33:31 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 02 09:33:31 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Feb 02 09:33:31 compute-1 ovs-ctl[47349]: Inserting openvswitch module [  OK  ]
Feb 02 09:33:31 compute-1 ovs-ctl[47318]: Starting ovs-vswitchd [  OK  ]
Feb 02 09:33:31 compute-1 ovs-vsctl[47369]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Feb 02 09:33:31 compute-1 ovs-ctl[47318]: Enabling remote OVSDB managers [  OK  ]
Feb 02 09:33:31 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 02 09:33:31 compute-1 systemd[1]: Starting Open vSwitch...
Feb 02 09:33:31 compute-1 systemd[1]: Finished Open vSwitch.
Feb 02 09:33:31 compute-1 sudo[47179]: pam_unix(sudo:session): session closed for user root
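Enabling and starting openvswitch.service is what triggers the ovs-ctl first-boot path above: conf.db is created, ovsdb-server and ovs-vswitchd start, and the kernel openvswitch module is inserted. Based on the logged arguments, the task is roughly:

- name: Enable and start Open vSwitch
  ansible.builtin.systemd:
    name: openvswitch.service
    enabled: true
    state: started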
Feb 02 09:33:32 compute-1 python3.9[47521]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:33:33 compute-1 sudo[47671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xncgejcyikrehssfleenkdrzhphxhphk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024812.6112123-262-179471054317081/AnsiballZ_sefcontext.py'
Feb 02 09:33:33 compute-1 sudo[47671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:33 compute-1 python3.9[47673]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 02 09:33:34 compute-1 kernel: SELinux:  Converting 2754 SID table entries...
Feb 02 09:33:34 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 02 09:33:34 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 02 09:33:34 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 02 09:33:34 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 02 09:33:34 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 02 09:33:34 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 02 09:33:34 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 02 09:33:34 compute-1 sudo[47671]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:35 compute-1 python3.9[47828]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:33:36 compute-1 sudo[47984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgkrrqiipycvxyxhwdjhhpdhigrrjjqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024815.7637918-316-37133884291034/AnsiballZ_dnf.py'
Feb 02 09:33:36 compute-1 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 02 09:33:36 compute-1 sudo[47984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:36 compute-1 python3.9[47986]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:33:37 compute-1 sudo[47984]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:38 compute-1 sudo[48137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgbaadkunbaqmoqnkeffxqgeolxbdvgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024817.6511545-340-24770178223405/AnsiballZ_command.py'
Feb 02 09:33:38 compute-1 sudo[48137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:38 compute-1 python3.9[48139]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:33:38 compute-1 sudo[48137]: pam_unix(sudo:session): session closed for user root
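The large base package set is installed in a single dnf transaction and then verified with rpm -V, which prints output (and exits non-zero) only if an installed file differs from the package metadata. An approximate equivalent of the verification step (how the play handles a non-zero exit is not visible in the log):

- name: Verify the base packages are unmodified
  ansible.builtin.command:
    cmd: >-
      rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux
      python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned
      systemd-container crypto-policies-scripts grubby sos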
Feb 02 09:33:39 compute-1 sudo[48424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axzosccotcapkpfhfrxlqcqxcwlkfqel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024819.1523268-364-164874149883943/AnsiballZ_file.py'
Feb 02 09:33:39 compute-1 sudo[48424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:39 compute-1 python3.9[48426]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 02 09:33:39 compute-1 sudo[48424]: pam_unix(sudo:session): session closed for user root
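The sefcontext task earlier recorded a persistent SELinux file-context rule for /var/lib/edpm-config (hence the second policy reload logged by the kernel), and the file task just above creates the directory with that label and zuul ownership. A sketch assembled from the logged parameters:

- name: Label /var/lib/edpm-config for container access
  community.general.sefcontext:
    target: '/var/lib/edpm-config(/.*)?'
    setype: container_file_t
    selevel: s0
    state: present
    reload: true

- name: Create the edpm-config directory
  ansible.builtin.file:
    path: /var/lib/edpm-config
    state: directory
    owner: zuul
    group: zuul
    mode: '0755'
    setype: container_file_t
    selevel: s0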
Feb 02 09:33:40 compute-1 python3.9[48576]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:33:41 compute-1 sudo[48728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsfamlaznkxbnixpitzkrpdttrylrsat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024820.9327688-412-23239885711380/AnsiballZ_dnf.py'
Feb 02 09:33:41 compute-1 sudo[48728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:41 compute-1 python3.9[48730]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:33:43 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 02 09:33:43 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 02 09:33:43 compute-1 systemd[1]: Reloading.
Feb 02 09:33:43 compute-1 systemd-rc-local-generator[48765]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:33:43 compute-1 systemd-sysv-generator[48770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:33:43 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 02 09:33:43 compute-1 sudo[48728]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:43 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 02 09:33:43 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 02 09:33:43 compute-1 systemd[1]: run-raf11a8c34f8349dba4fb96444987709c.service: Deactivated successfully.
Feb 02 09:33:44 compute-1 sudo[49045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yczskggxibcrywplxvwgmtepubaatnac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024823.9588435-436-167409755567711/AnsiballZ_systemd.py'
Feb 02 09:33:44 compute-1 sudo[49045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:44 compute-1 python3.9[49047]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:33:44 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 02 09:33:44 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Feb 02 09:33:44 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Feb 02 09:33:44 compute-1 systemd[1]: Stopping Network Manager...
Feb 02 09:33:44 compute-1 NetworkManager[7213]: <info>  [1770024824.6651] caught SIGTERM, shutting down normally.
Feb 02 09:33:44 compute-1 NetworkManager[7213]: <info>  [1770024824.6667] dhcp4 (eth0): canceled DHCP transaction
Feb 02 09:33:44 compute-1 NetworkManager[7213]: <info>  [1770024824.6667] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 02 09:33:44 compute-1 NetworkManager[7213]: <info>  [1770024824.6667] dhcp4 (eth0): state changed no lease
Feb 02 09:33:44 compute-1 NetworkManager[7213]: <info>  [1770024824.6671] manager: NetworkManager state is now CONNECTED_SITE
Feb 02 09:33:44 compute-1 NetworkManager[7213]: <info>  [1770024824.6739] exiting (success)
Feb 02 09:33:44 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 02 09:33:44 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 02 09:33:44 compute-1 systemd[1]: Stopped Network Manager.
Feb 02 09:33:44 compute-1 systemd[1]: NetworkManager.service: Consumed 14.564s CPU time, 4.1M memory peak, read 0B from disk, written 34.5K to disk.
Feb 02 09:33:44 compute-1 systemd[1]: Starting Network Manager...
Feb 02 09:33:44 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.7235] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:73bfa7d4-cc72-468c-831e-edc1e8589b87)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.7236] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.7286] manager[0x55ab2a03f000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 02 09:33:44 compute-1 systemd[1]: Starting Hostname Service...
Feb 02 09:33:44 compute-1 systemd[1]: Started Hostname Service.
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8143] hostname: hostname: using hostnamed
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8144] hostname: static hostname changed from (none) to "compute-1"
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8151] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8157] manager[0x55ab2a03f000]: rfkill: Wi-Fi hardware radio set enabled
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8157] manager[0x55ab2a03f000]: rfkill: WWAN hardware radio set enabled
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8187] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8201] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8202] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8203] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8203] manager: Networking is enabled by state file
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8206] settings: Loaded settings plugin: keyfile (internal)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8211] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8248] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8261] dhcp: init: Using DHCP client 'internal'
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8265] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8272] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8280] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8291] device (lo): Activation: starting connection 'lo' (5d713ff7-af86-4df5-9d5a-ad7ed5dcc84d)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8299] device (eth0): carrier: link connected
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8305] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8312] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8313] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8321] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8330] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8337] device (eth1): carrier: link connected
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8343] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8349] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50) (indicated)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8350] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8357] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8367] device (eth1): Activation: starting connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50)
Feb 02 09:33:44 compute-1 systemd[1]: Started Network Manager.
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8373] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8390] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8394] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8398] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8408] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8413] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8417] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8420] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8425] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8435] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8439] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8450] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8472] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8487] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8492] dhcp4 (eth0): state changed new lease, address=38.102.83.189
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8498] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8507] device (lo): Activation: successful, device activated.
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8522] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 02 09:33:44 compute-1 systemd[1]: Starting Network Manager Wait Online...
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8596] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8602] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8609] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8613] manager: NetworkManager state is now CONNECTED_LOCAL
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8617] device (eth1): Activation: successful, device activated.
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8628] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8629] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8634] manager: NetworkManager state is now CONNECTED_SITE
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8636] device (eth0): Activation: successful, device activated.
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8644] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 02 09:33:44 compute-1 NetworkManager[49055]: <info>  [1770024824.8648] manager: startup complete
Feb 02 09:33:44 compute-1 sudo[49045]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:44 compute-1 systemd[1]: Finished Network Manager Wait Online.
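Restarting NetworkManager here picks up the just-installed NetworkManager-ovs package: the new instance logs the NMOvsFactory device plugin, re-assumes the existing 'System eth0' and 'ci-private-network' profiles, and obtains a fresh DHCP lease, after which NetworkManager-wait-online completes. The restart corresponds to a task like:

- name: Restart NetworkManager to load the OVS plugin
  ansible.builtin.systemd:
    name: NetworkManager
    state: restarted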
Feb 02 09:33:45 compute-1 sudo[49271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjjhkqqsrcclgbftkuzwsdiqiioqtmzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024825.056614-460-189710479453678/AnsiballZ_dnf.py'
Feb 02 09:33:45 compute-1 sudo[49271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:45 compute-1 python3.9[49273]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:33:49 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 02 09:33:49 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 02 09:33:49 compute-1 systemd[1]: Reloading.
Feb 02 09:33:49 compute-1 systemd-rc-local-generator[49324]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:33:49 compute-1 systemd-sysv-generator[49330]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:33:49 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 02 09:33:50 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 02 09:33:50 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 02 09:33:50 compute-1 systemd[1]: run-rddc579ad61724063b6b170ff9fb0ec23.service: Deactivated successfully.
Feb 02 09:33:50 compute-1 sudo[49271]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:51 compute-1 sudo[49730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wceqltfsrboxkjwdhxzccmzwadzbphjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024831.0290492-496-130599290510727/AnsiballZ_stat.py'
Feb 02 09:33:51 compute-1 sudo[49730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:51 compute-1 python3.9[49732]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:33:51 compute-1 sudo[49730]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:52 compute-1 sudo[49882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cygldiusadfqtowkhxpmpotnbmakpfqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024831.6947322-523-51314736332210/AnsiballZ_ini_file.py'
Feb 02 09:33:52 compute-1 sudo[49882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:52 compute-1 python3.9[49884]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:33:52 compute-1 sudo[49882]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:53 compute-1 sudo[50036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbfxtylaellihogpifshvkofzxvotjhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024832.837216-553-227106266109315/AnsiballZ_ini_file.py'
Feb 02 09:33:53 compute-1 sudo[50036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:53 compute-1 python3.9[50038]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:33:53 compute-1 sudo[50036]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:53 compute-1 sudo[50188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgjlnyplnlrypsexlgalstvjueraievx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024833.453526-553-38263738395039/AnsiballZ_ini_file.py'
Feb 02 09:33:53 compute-1 sudo[50188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:53 compute-1 python3.9[50190]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:33:53 compute-1 sudo[50188]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:54 compute-1 sudo[50340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nydhdtgwsgoemruceiooiciidguaatut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024834.048094-598-169437363085499/AnsiballZ_ini_file.py'
Feb 02 09:33:54 compute-1 sudo[50340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:54 compute-1 python3.9[50342]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:33:54 compute-1 sudo[50340]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:54 compute-1 sudo[50492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlfuzvmisgxucnyzrjecfwkronyolzeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024834.6280236-598-263821878651256/AnsiballZ_ini_file.py'
Feb 02 09:33:54 compute-1 sudo[50492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:54 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 02 09:33:55 compute-1 python3.9[50494]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:33:55 compute-1 sudo[50492]: pam_unix(sudo:session): session closed for user root
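The ini_file tasks above normalize NetworkManager's [main] section: no-auto-default=* is added to NetworkManager.conf, while dns=none and rc-manager=unmanaged are removed from both NetworkManager.conf and the 99-cloud-init.conf drop-in, so NetworkManager takes back DNS/resolv.conf handling. Reconstructed from the logged arguments (one addition and one removal shown; the drop-in edits follow the same pattern):

- name: Stop NetworkManager from creating automatic default connections
  community.general.ini_file:
    path: /etc/NetworkManager/NetworkManager.conf
    section: main
    option: no-auto-default
    value: '*'
    no_extra_spaces: true
    backup: true
    mode: '0644'

- name: Drop dns=none so NetworkManager manages DNS again
  community.general.ini_file:
    path: /etc/NetworkManager/NetworkManager.conf
    section: main
    option: dns
    value: none
    state: absent
    backup: true
    mode: '0644'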
Feb 02 09:33:55 compute-1 sudo[50644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itgfuvgflbbgcvpzozdlbegnkekirmnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024835.1926363-643-195722698107248/AnsiballZ_stat.py'
Feb 02 09:33:55 compute-1 sudo[50644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:55 compute-1 python3.9[50646]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:33:55 compute-1 sudo[50644]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:56 compute-1 sudo[50767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kthcuubyymbvuicaavzbdbwslpcmomma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024835.1926363-643-195722698107248/AnsiballZ_copy.py'
Feb 02 09:33:56 compute-1 sudo[50767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:56 compute-1 python3.9[50769]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024835.1926363-643-195722698107248/.source _original_basename=.n50w5qmt follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:33:56 compute-1 sudo[50767]: pam_unix(sudo:session): session closed for user root
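Ansible's usual stat-then-copy pair installs a dhclient enter hook at /etc/dhcp/dhclient-enter-hooks with mode 0755; the hook's contents are not visible in the log (only its checksum). The copy step is roughly (the src name is illustrative):

- name: Install the dhclient enter hook
  ansible.builtin.copy:
    src: dhclient-enter-hooks   # actual source file is not shown in the log
    dest: /etc/dhcp/dhclient-enter-hooks
    mode: '0755'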
Feb 02 09:33:56 compute-1 sudo[50919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xonpsbnvphlufkrsadglberkfakutozy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024836.6453674-688-68486702811670/AnsiballZ_file.py'
Feb 02 09:33:56 compute-1 sudo[50919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:57 compute-1 python3.9[50921]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:33:57 compute-1 sudo[50919]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:57 compute-1 sudo[51071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahnlwadkcsowzicnnfmfpoupllzpqkzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024837.3644657-712-73565578776238/AnsiballZ_edpm_os_net_config_mappings.py'
Feb 02 09:33:57 compute-1 sudo[51071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:58 compute-1 python3.9[51073]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 02 09:33:58 compute-1 sudo[51071]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:58 compute-1 sudo[51223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iinwnxukaeqgjtolayzycmolwxssxted ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024838.4326108-739-194358986868849/AnsiballZ_file.py'
Feb 02 09:33:58 compute-1 sudo[51223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:58 compute-1 python3.9[51225]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:33:58 compute-1 sudo[51223]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:59 compute-1 sudo[51375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aemustnnsqorrqqmumwfutnzpmikyoqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024839.2235785-769-41551801481268/AnsiballZ_stat.py'
Feb 02 09:33:59 compute-1 sudo[51375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:33:59 compute-1 sudo[51375]: pam_unix(sudo:session): session closed for user root
Feb 02 09:33:59 compute-1 sudo[51498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usgttxsoftdwyyvuimpcctxxglqswcat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024839.2235785-769-41551801481268/AnsiballZ_copy.py'
Feb 02 09:33:59 compute-1 sudo[51498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:00 compute-1 sudo[51498]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:00 compute-1 sudo[51650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hapxtncmkpxvtkkfhslrpucsfbjphilv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024840.358862-814-5958809830149/AnsiballZ_slurp.py'
Feb 02 09:34:00 compute-1 sudo[51650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:01 compute-1 python3.9[51652]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 02 09:34:01 compute-1 sudo[51650]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:02 compute-1 sudo[51825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nghzcagajsispccivvzqxawicpuqzmdi ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024841.3594537-841-228079003665104/async_wrapper.py j446185006143 300 /home/zuul/.ansible/tmp/ansible-tmp-1770024841.3594537-841-228079003665104/AnsiballZ_edpm_os_net_config.py _'
Feb 02 09:34:02 compute-1 sudo[51825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:02 compute-1 ansible-async_wrapper.py[51827]: Invoked with j446185006143 300 /home/zuul/.ansible/tmp/ansible-tmp-1770024841.3594537-841-228079003665104/AnsiballZ_edpm_os_net_config.py _
Feb 02 09:34:02 compute-1 ansible-async_wrapper.py[51830]: Starting module and watcher
Feb 02 09:34:02 compute-1 ansible-async_wrapper.py[51830]: Start watching 51831 (300)
Feb 02 09:34:02 compute-1 ansible-async_wrapper.py[51831]: Start module (51831)
Feb 02 09:34:02 compute-1 ansible-async_wrapper.py[51827]: Return async_wrapper task started.
Feb 02 09:34:02 compute-1 sudo[51825]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:02 compute-1 python3.9[51832]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
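The network layout itself is applied by the edpm_os_net_config module, launched through Ansible's async wrapper with a 300-second limit so a connectivity-breaking change cannot hang the play indefinitely. Based on the logged arguments, the task looks roughly like this (the fully qualified collection name and the polling strategy are not visible in the log):

- name: Apply the os-net-config network layout
  edpm_os_net_config:
    config_file: /etc/os-net-config/config.yaml
    use_nmstate: true
    detailed_exit_codes: true
    cleanup: true
    debug: true
    safe_defaults: false
  async: 300
  # how the play waits on the async job (poll / async_status) is not shown in the log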
Feb 02 09:34:03 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 02 09:34:03 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 02 09:34:03 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb 02 09:34:03 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 02 09:34:03 compute-1 kernel: cfg80211: failed to load regulatory.db
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.3270] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.3297] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4017] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4019] audit: op="connection-add" uuid="12147afe-840c-4d71-9d90-87848bae2322" name="br-ex-br" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4036] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4037] audit: op="connection-add" uuid="64011fed-cfe7-4a87-81c3-73a14c673061" name="br-ex-port" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4051] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4053] audit: op="connection-add" uuid="64571508-7a4f-4b5e-89c5-3e1f65b65a54" name="eth1-port" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4066] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4068] audit: op="connection-add" uuid="1bb4e91c-0132-453d-a16f-0b09c1c65047" name="vlan20-port" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4080] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4082] audit: op="connection-add" uuid="ebf6e626-46d2-4e7b-ac53-74cb51313eba" name="vlan21-port" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4097] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4098] audit: op="connection-add" uuid="0b57298e-2470-4dd2-a204-40cf2c152f2d" name="vlan22-port" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4111] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4113] audit: op="connection-add" uuid="a493b690-bd4a-4885-a502-03c3741c796c" name="vlan23-port" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4137] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4158] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4160] audit: op="connection-add" uuid="5073fcc2-d4a1-4649-b330-3ba11bcafee3" name="br-ex-if" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4206] audit: op="connection-update" uuid="ee27c9b5-5e51-5927-b192-b2b3e6929a50" name="ci-private-network" args="ipv6.routing-rules,ipv6.dns,ipv6.addr-gen-mode,ipv6.routes,ipv6.method,ipv6.addresses,ovs-external-ids.data,ovs-interface.type,connection.controller,connection.port-type,connection.master,connection.slave-type,connection.timestamp,ipv4.never-default,ipv4.dns,ipv4.routing-rules,ipv4.routes,ipv4.method,ipv4.addresses" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4233] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4234] audit: op="connection-add" uuid="7057c86f-124b-448c-8b89-082381965557" name="vlan20-if" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4253] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4255] audit: op="connection-add" uuid="8bf5f96c-762d-42bf-9c6e-7b928bb9f5de" name="vlan21-if" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4282] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4284] audit: op="connection-add" uuid="c8540e5b-66cb-4359-99d1-b8486c8c24f2" name="vlan22-if" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4313] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4316] audit: op="connection-add" uuid="e90cf2e2-a0e0-4fc9-a1d7-b162c2f7e589" name="vlan23-if" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4334] audit: op="connection-delete" uuid="4b750100-60f6-352c-8e2d-c3f0f09dae3a" name="Wired connection 1" pid=51833 uid=0 result="success"
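The checkpoint and the burst of connection-add operations show the nmstate-backed run building an OVS bridge br-ex that takes over eth1 and adds internal interfaces vlan20 through vlan23, while the leftover 'Wired connection 1' profile is deleted. The config.yaml driving this is not included in the log; a plausible os-net-config shape for such a layout is sketched below (only the device names come from the log, everything else is an assumption):

network_config:
  - type: ovs_bridge
    name: br-ex
    use_dhcp: false
    members:
      - type: interface
        name: eth1
      - type: vlan
        vlan_id: 20
      - type: vlan
        vlan_id: 21
      - type: vlan
        vlan_id: 22
      - type: vlan
        vlan_id: 23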
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4354] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4358] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4369] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4376] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (12147afe-840c-4d71-9d90-87848bae2322)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4377] audit: op="connection-activate" uuid="12147afe-840c-4d71-9d90-87848bae2322" name="br-ex-br" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4380] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4381] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4388] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4394] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (64011fed-cfe7-4a87-81c3-73a14c673061)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4396] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4398] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4404] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4409] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (64571508-7a4f-4b5e-89c5-3e1f65b65a54)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4411] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4413] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4420] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4430] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (1bb4e91c-0132-453d-a16f-0b09c1c65047)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4435] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4436] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4447] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4457] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ebf6e626-46d2-4e7b-ac53-74cb51313eba)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4460] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4462] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4470] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4475] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (0b57298e-2470-4dd2-a204-40cf2c152f2d)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4478] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4479] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4487] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4493] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (a493b690-bd4a-4885-a502-03c3741c796c)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4494] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4498] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4501] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4509] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4510] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4524] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4529] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (5073fcc2-d4a1-4649-b330-3ba11bcafee3)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4529] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4533] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4535] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4536] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4537] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4550] device (eth1): disconnecting for new activation request.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4551] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4554] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4555] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4557] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4560] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4561] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4564] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4568] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (7057c86f-124b-448c-8b89-082381965557)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4569] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4572] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4574] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4575] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4578] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4579] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4582] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4586] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (8bf5f96c-762d-42bf-9c6e-7b928bb9f5de)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4587] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4590] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4592] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4594] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4596] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4597] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4600] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4605] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (c8540e5b-66cb-4359-99d1-b8486c8c24f2)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4605] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4608] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4610] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4611] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4614] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <warn>  [1770024844.4615] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4618] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4623] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (e90cf2e2-a0e0-4fc9-a1d7-b162c2f7e589)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4623] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4626] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4628] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4629] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4631] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4647] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4649] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4654] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4656] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4662] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4666] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4671] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4674] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4676] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 kernel: ovs-system: entered promiscuous mode
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4680] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4684] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4686] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4688] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4692] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4695] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4698] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4701] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 systemd-udevd[51839]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 09:34:04 compute-1 kernel: Timeout policy base is empty
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4707] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4710] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4713] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4714] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4718] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4721] dhcp4 (eth0): canceled DHCP transaction
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4722] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4722] dhcp4 (eth0): state changed no lease
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4723] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4733] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4736] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51833 uid=0 result="fail" reason="Device is not activated"
Feb 02 09:34:04 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4770] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4775] dhcp4 (eth0): state changed new lease, address=38.102.83.189
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4814] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4833] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4843] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4863] device (eth1): disconnecting for new activation request.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4864] audit: op="connection-activate" uuid="ee27c9b5-5e51-5927-b192-b2b3e6929a50" name="ci-private-network" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4899] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb 02 09:34:04 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.4940] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 02 09:34:04 compute-1 kernel: br-ex: entered promiscuous mode
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5090] device (eth1): Activation: starting connection 'ci-private-network' (ee27c9b5-5e51-5927-b192-b2b3e6929a50)
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5097] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5106] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5111] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5118] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5123] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5133] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5136] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5138] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5140] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5142] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5144] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 kernel: vlan22: entered promiscuous mode
Feb 02 09:34:04 compute-1 systemd-udevd[51838]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5158] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5174] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5179] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 kernel: vlan23: entered promiscuous mode
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5184] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5188] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5193] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5197] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5201] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5206] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5210] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5215] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5219] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5224] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5236] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5242] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5257] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb 02 09:34:04 compute-1 kernel: vlan21: entered promiscuous mode
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5274] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5279] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5296] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5302] device (eth1): Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5310] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Feb 02 09:34:04 compute-1 kernel: vlan20: entered promiscuous mode
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5325] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5354] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5369] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5375] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5385] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5393] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5399] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5406] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5408] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5413] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5420] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5425] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5432] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5437] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5451] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5468] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5474] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5478] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5482] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5488] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5489] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 02 09:34:04 compute-1 NetworkManager[49055]: <info>  [1770024844.5493] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 02 09:34:05 compute-1 NetworkManager[49055]: <info>  [1770024845.6691] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb 02 09:34:05 compute-1 sudo[52188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sujcmylnnfmpjeeakzcaqahxhqifaosa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024845.3893907-841-205938417908927/AnsiballZ_async_status.py'
Feb 02 09:34:05 compute-1 sudo[52188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:05 compute-1 NetworkManager[49055]: <info>  [1770024845.8340] checkpoint[0x55ab2a014950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb 02 09:34:05 compute-1 NetworkManager[49055]: <info>  [1770024845.8343] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51833 uid=0 result="success"
Feb 02 09:34:05 compute-1 python3.9[52190]: ansible-ansible.legacy.async_status Invoked with jid=j446185006143.51827 mode=status _async_dir=/root/.ansible_async
Feb 02 09:34:05 compute-1 sudo[52188]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:06 compute-1 NetworkManager[49055]: <info>  [1770024846.1883] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51833 uid=0 result="success"
Feb 02 09:34:06 compute-1 NetworkManager[49055]: <info>  [1770024846.1898] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51833 uid=0 result="success"
Feb 02 09:34:06 compute-1 NetworkManager[49055]: <info>  [1770024846.4390] audit: op="networking-control" arg="global-dns-configuration" pid=51833 uid=0 result="success"
Feb 02 09:34:06 compute-1 NetworkManager[49055]: <info>  [1770024846.4422] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb 02 09:34:06 compute-1 NetworkManager[49055]: <info>  [1770024846.4452] audit: op="networking-control" arg="global-dns-configuration" pid=51833 uid=0 result="success"
Feb 02 09:34:06 compute-1 NetworkManager[49055]: <info>  [1770024846.4487] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51833 uid=0 result="success"
Feb 02 09:34:06 compute-1 NetworkManager[49055]: <info>  [1770024846.6004] checkpoint[0x55ab2a014a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb 02 09:34:06 compute-1 NetworkManager[49055]: <info>  [1770024846.6008] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51833 uid=0 result="success"
Feb 02 09:34:06 compute-1 ansible-async_wrapper.py[51831]: Module complete (51831)
Feb 02 09:34:07 compute-1 ansible-async_wrapper.py[51830]: Done in kid B.
Feb 02 09:34:09 compute-1 sudo[52294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbwktbzzmzwdxtrvhjmalemxihsntakk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024845.3893907-841-205938417908927/AnsiballZ_async_status.py'
Feb 02 09:34:09 compute-1 sudo[52294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:09 compute-1 python3.9[52296]: ansible-ansible.legacy.async_status Invoked with jid=j446185006143.51827 mode=status _async_dir=/root/.ansible_async
Feb 02 09:34:09 compute-1 sudo[52294]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:09 compute-1 sudo[52394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdovcsacraqiiwbyraehpopibzwypixs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024845.3893907-841-205938417908927/AnsiballZ_async_status.py'
Feb 02 09:34:09 compute-1 sudo[52394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:10 compute-1 python3.9[52396]: ansible-ansible.legacy.async_status Invoked with jid=j446185006143.51827 mode=cleanup _async_dir=/root/.ansible_async
Feb 02 09:34:10 compute-1 sudo[52394]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:11 compute-1 sudo[52546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdiwpjefoixjveycrxaarzoekoxycfoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024851.0908043-922-92139863094412/AnsiballZ_stat.py'
Feb 02 09:34:11 compute-1 sudo[52546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:11 compute-1 python3.9[52548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:34:11 compute-1 sudo[52546]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:11 compute-1 sudo[52669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmhgvmdwuzqhtccotpffzdqgfdwbnflh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024851.0908043-922-92139863094412/AnsiballZ_copy.py'
Feb 02 09:34:11 compute-1 sudo[52669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:12 compute-1 python3.9[52671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024851.0908043-922-92139863094412/.source.returncode _original_basename=.3too2mv0 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:34:12 compute-1 sudo[52669]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:12 compute-1 sudo[52822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozmqqjsxprmfkstyocgkshvnyqkolwfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024852.4113955-970-24807962649288/AnsiballZ_stat.py'
Feb 02 09:34:12 compute-1 sudo[52822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:12 compute-1 python3.9[52824]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:34:12 compute-1 sudo[52822]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:13 compute-1 sudo[52945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gecfvlusnzepqawwqvmqygscoqylaydk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024852.4113955-970-24807962649288/AnsiballZ_copy.py'
Feb 02 09:34:13 compute-1 sudo[52945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:13 compute-1 python3.9[52947]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024852.4113955-970-24807962649288/.source.cfg _original_basename=.iopzdpeq follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:34:13 compute-1 sudo[52945]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:14 compute-1 sudo[53097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shlgcgppyvnyojwdjhzgzmoaekgvzrdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024854.0152664-1015-195084735862659/AnsiballZ_systemd.py'
Feb 02 09:34:14 compute-1 sudo[53097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:14 compute-1 python3.9[53099]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:34:14 compute-1 systemd[1]: Reloading Network Manager...
Feb 02 09:34:14 compute-1 NetworkManager[49055]: <info>  [1770024854.6484] audit: op="reload" arg="0" pid=53103 uid=0 result="success"
Feb 02 09:34:14 compute-1 NetworkManager[49055]: <info>  [1770024854.6490] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb 02 09:34:14 compute-1 systemd[1]: Reloaded Network Manager.
Feb 02 09:34:14 compute-1 sudo[53097]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:14 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 02 09:34:15 compute-1 sshd-session[45057]: Connection closed by 192.168.122.30 port 60874
Feb 02 09:34:15 compute-1 sshd-session[45054]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:34:15 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Feb 02 09:34:15 compute-1 systemd[1]: session-11.scope: Consumed 43.624s CPU time.
Feb 02 09:34:15 compute-1 systemd-logind[805]: Session 11 logged out. Waiting for processes to exit.
Feb 02 09:34:15 compute-1 systemd-logind[805]: Removed session 11.
Feb 02 09:34:20 compute-1 sshd-session[53136]: Accepted publickey for zuul from 192.168.122.30 port 52760 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:34:20 compute-1 systemd-logind[805]: New session 12 of user zuul.
Feb 02 09:34:20 compute-1 systemd[1]: Started Session 12 of User zuul.
Feb 02 09:34:20 compute-1 sshd-session[53136]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:34:21 compute-1 python3.9[53289]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:34:22 compute-1 python3.9[53443]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:34:23 compute-1 python3.9[53637]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:34:24 compute-1 sshd-session[53139]: Connection closed by 192.168.122.30 port 52760
Feb 02 09:34:24 compute-1 sshd-session[53136]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:34:24 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Feb 02 09:34:24 compute-1 systemd[1]: session-12.scope: Consumed 2.101s CPU time.
Feb 02 09:34:24 compute-1 systemd-logind[805]: Session 12 logged out. Waiting for processes to exit.
Feb 02 09:34:24 compute-1 systemd-logind[805]: Removed session 12.
Feb 02 09:34:24 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 02 09:34:29 compute-1 sshd-session[53666]: Accepted publickey for zuul from 192.168.122.30 port 39194 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:34:29 compute-1 systemd-logind[805]: New session 13 of user zuul.
Feb 02 09:34:29 compute-1 systemd[1]: Started Session 13 of User zuul.
Feb 02 09:34:29 compute-1 sshd-session[53666]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:34:30 compute-1 python3.9[53819]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:34:31 compute-1 python3.9[53973]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:34:32 compute-1 sudo[54128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efobqomxwxffpykqmwsyhvedskrmqlaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024871.9179251-76-201766066501061/AnsiballZ_setup.py'
Feb 02 09:34:32 compute-1 sudo[54128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:32 compute-1 python3.9[54130]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:34:32 compute-1 sudo[54128]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:33 compute-1 sudo[54212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nieukxjflufdaugtlepsrfpzlbpvqvca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024871.9179251-76-201766066501061/AnsiballZ_dnf.py'
Feb 02 09:34:33 compute-1 sudo[54212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:33 compute-1 python3.9[54214]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:34:34 compute-1 sudo[54212]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:34 compute-1 sudo[54365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wprlfuwmewampcjawruavgppdvunzfom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024874.5667088-112-172579451164473/AnsiballZ_setup.py'
Feb 02 09:34:34 compute-1 sudo[54365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:35 compute-1 python3.9[54367]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:34:35 compute-1 sudo[54365]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:36 compute-1 sudo[54561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyibzhcqjxfldqeefkquebjamunhusqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024875.8420525-145-132299163310678/AnsiballZ_file.py'
Feb 02 09:34:36 compute-1 sudo[54561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:36 compute-1 python3.9[54563]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:34:36 compute-1 sudo[54561]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:37 compute-1 sudo[54713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztsahylnftblcxivryglyknjoaaemjdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024876.793659-169-22981427955068/AnsiballZ_command.py'
Feb 02 09:34:37 compute-1 sudo[54713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:37 compute-1 python3.9[54715]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:34:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat3221578336-merged.mount: Deactivated successfully.
Feb 02 09:34:37 compute-1 podman[54716]: 2026-02-02 09:34:37.483190949 +0000 UTC m=+0.064142764 system refresh
Feb 02 09:34:37 compute-1 sudo[54713]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:38 compute-1 sudo[54877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bylqsclublejtslowrdbmqkknrzuukog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024877.6941218-193-73341524666599/AnsiballZ_stat.py'
Feb 02 09:34:38 compute-1 sudo[54877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:38 compute-1 python3.9[54879]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:34:38 compute-1 sudo[54877]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:38 compute-1 systemd[1]: var-lib-containers-storage-overlay-opaque\x2dbug\x2dcheck2425735953-merged.mount: Deactivated successfully.
Feb 02 09:34:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:34:38 compute-1 sudo[55000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxfrysyomilbkknglnjugpyyfrkgsxfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024877.6941218-193-73341524666599/AnsiballZ_copy.py'
Feb 02 09:34:38 compute-1 sudo[55000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:39 compute-1 python3.9[55002]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024877.6941218-193-73341524666599/.source.json follow=False _original_basename=podman_network_config.j2 checksum=85b621edfd6b57aa7ba64b670d6d71e13d1e3a57 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:34:39 compute-1 sudo[55000]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:39 compute-1 sudo[55152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvapzytaryyopmtsthlrcziuxupebecb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024879.2886827-238-154031085959593/AnsiballZ_stat.py'
Feb 02 09:34:39 compute-1 sudo[55152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:39 compute-1 python3.9[55154]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:34:39 compute-1 sudo[55152]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:40 compute-1 sudo[55275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtwngnmkhmutbyswffypjtdrghrrjazw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024879.2886827-238-154031085959593/AnsiballZ_copy.py'
Feb 02 09:34:40 compute-1 sudo[55275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:40 compute-1 python3.9[55277]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024879.2886827-238-154031085959593/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:34:40 compute-1 sudo[55275]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:41 compute-1 sudo[55427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xygizjtfsdpgjouqpwdoqkqizdyxmplm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024880.880274-286-266652401144506/AnsiballZ_ini_file.py'
Feb 02 09:34:41 compute-1 sudo[55427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:41 compute-1 python3.9[55429]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:34:41 compute-1 sudo[55427]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:41 compute-1 sudo[55579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrozzodtkdzgzbhfxkkobnfjkvifmopj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024881.626987-286-172139488903291/AnsiballZ_ini_file.py'
Feb 02 09:34:41 compute-1 sudo[55579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:42 compute-1 python3.9[55581]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:34:42 compute-1 sudo[55579]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:42 compute-1 sudo[55731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttxwwftfxslqyrjxevussgkgpnwgdcnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024882.1896834-286-66871688718806/AnsiballZ_ini_file.py'
Feb 02 09:34:42 compute-1 sudo[55731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:42 compute-1 python3.9[55733]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:34:42 compute-1 sudo[55731]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:42 compute-1 sudo[55883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-racxnhtofgbgcjftbrajsrqpvkxnfeim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024882.7542539-286-194348266853814/AnsiballZ_ini_file.py'
Feb 02 09:34:42 compute-1 sudo[55883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:43 compute-1 python3.9[55885]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:34:43 compute-1 sudo[55883]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:43 compute-1 sudo[56035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fupsojsoisxpobzejossrsmwupbbbtnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024883.5557618-379-137051336312760/AnsiballZ_dnf.py'
Feb 02 09:34:43 compute-1 sudo[56035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:43 compute-1 python3.9[56037]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:34:45 compute-1 sudo[56035]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:45 compute-1 sudo[56188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jifbkeaaagdbixlhujmuxlhgoigkvnfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024885.6469274-412-143294793975324/AnsiballZ_setup.py'
Feb 02 09:34:45 compute-1 sudo[56188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:46 compute-1 python3.9[56190]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:34:46 compute-1 sudo[56188]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:46 compute-1 sudo[56342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjoikpofsfgokrcsnagzgekpysmgrztb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024886.431629-436-185459571107549/AnsiballZ_stat.py'
Feb 02 09:34:46 compute-1 sudo[56342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:46 compute-1 python3.9[56344]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:34:46 compute-1 sudo[56342]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:47 compute-1 sudo[56494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsbcgdgstaawqqgyszbipggtburhzjpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024887.1281314-463-248247244113712/AnsiballZ_stat.py'
Feb 02 09:34:47 compute-1 sudo[56494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:47 compute-1 python3.9[56496]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:34:47 compute-1 sudo[56494]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:48 compute-1 sudo[56646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mffqnddiujdlykthiionvdyyfecudmnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024888.1192207-493-249592013146982/AnsiballZ_command.py'
Feb 02 09:34:48 compute-1 sudo[56646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:48 compute-1 python3.9[56648]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:34:48 compute-1 sudo[56646]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:49 compute-1 sudo[56799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibkxuexbxnqsucektvtnvixfxzdjpmmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024888.9117827-523-120419834685927/AnsiballZ_service_facts.py'
Feb 02 09:34:49 compute-1 sudo[56799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:49 compute-1 python3.9[56801]: ansible-service_facts Invoked
Feb 02 09:34:49 compute-1 network[56818]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 02 09:34:49 compute-1 network[56819]: 'network-scripts' will be removed from distribution in near future.
Feb 02 09:34:49 compute-1 network[56820]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 02 09:34:51 compute-1 sudo[56799]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:53 compute-1 sudo[57103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzbnybzxoupmyyeteyprwjshrjrddrlo ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1770024893.6708508-568-201910803229188/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1770024893.6708508-568-201910803229188/args'
Feb 02 09:34:53 compute-1 sudo[57103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:54 compute-1 sudo[57103]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:54 compute-1 sudo[57270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aysgnlfauyfleckovbpjbrspijrvopjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024894.397497-601-2212576698879/AnsiballZ_dnf.py'
Feb 02 09:34:54 compute-1 sudo[57270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:54 compute-1 python3.9[57272]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:34:56 compute-1 sudo[57270]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:57 compute-1 sudo[57423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzwbgqjombkrydbinuefrtkjqipnaguz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024896.6237152-640-176715065742270/AnsiballZ_package_facts.py'
Feb 02 09:34:57 compute-1 sudo[57423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:57 compute-1 python3.9[57425]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 02 09:34:57 compute-1 sudo[57423]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:58 compute-1 sudo[57575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkkqyhfpeqknegtwreemcyrpjsymlucy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024898.4417396-671-281242515459565/AnsiballZ_stat.py'
Feb 02 09:34:58 compute-1 sudo[57575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:58 compute-1 python3.9[57577]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:34:58 compute-1 sudo[57575]: pam_unix(sudo:session): session closed for user root
Feb 02 09:34:59 compute-1 sudo[57700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skoofjozcswtlnjeptvhasxabppeoboy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024898.4417396-671-281242515459565/AnsiballZ_copy.py'
Feb 02 09:34:59 compute-1 sudo[57700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:34:59 compute-1 python3.9[57702]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024898.4417396-671-281242515459565/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:34:59 compute-1 sudo[57700]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:00 compute-1 sudo[57854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecpcepabdcjjsfgjuzppwsnryteslija ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024899.8367538-716-45054612975642/AnsiballZ_stat.py'
Feb 02 09:35:00 compute-1 sudo[57854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:00 compute-1 python3.9[57856]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:00 compute-1 sudo[57854]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:00 compute-1 sudo[57979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrmhlukhytsipgdjujtxyqdxjvgwagvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024899.8367538-716-45054612975642/AnsiballZ_copy.py'
Feb 02 09:35:00 compute-1 sudo[57979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:00 compute-1 python3.9[57981]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024899.8367538-716-45054612975642/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:01 compute-1 sudo[57979]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:02 compute-1 sudo[58133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgkswkhyqodamtnqezdmhjjrbgmlbvij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024901.9825358-779-233127075452510/AnsiballZ_lineinfile.py'
Feb 02 09:35:02 compute-1 sudo[58133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:02 compute-1 python3.9[58135]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:02 compute-1 sudo[58133]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:03 compute-1 sudo[58287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vspkchuszfxteknquiwpdzsgywdzxtos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024903.647021-823-53663499830829/AnsiballZ_setup.py'
Feb 02 09:35:03 compute-1 sudo[58287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:04 compute-1 python3.9[58289]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:35:04 compute-1 sudo[58287]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:04 compute-1 sudo[58371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inzyizkfcsjjoeufpuazddkvxupvzkdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024903.647021-823-53663499830829/AnsiballZ_systemd.py'
Feb 02 09:35:04 compute-1 sudo[58371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:05 compute-1 python3.9[58373]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:35:05 compute-1 sudo[58371]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:06 compute-1 sudo[58525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpacmbmmgrjaluewzhugcrglxyslevbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024906.1481097-872-195357799965209/AnsiballZ_setup.py'
Feb 02 09:35:06 compute-1 sudo[58525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:06 compute-1 python3.9[58527]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:35:06 compute-1 sudo[58525]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:07 compute-1 sudo[58609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsmztkjswmpktzmjpotohdttoxsqfxlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024906.1481097-872-195357799965209/AnsiballZ_systemd.py'
Feb 02 09:35:07 compute-1 sudo[58609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:07 compute-1 python3.9[58611]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:35:07 compute-1 chronyd[811]: chronyd exiting
Feb 02 09:35:07 compute-1 systemd[1]: Stopping NTP client/server...
Feb 02 09:35:07 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Feb 02 09:35:07 compute-1 systemd[1]: Stopped NTP client/server.
Feb 02 09:35:07 compute-1 systemd[1]: Starting NTP client/server...
Feb 02 09:35:07 compute-1 chronyd[58619]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 02 09:35:07 compute-1 chronyd[58619]: Frequency -26.550 +/- 0.484 ppm read from /var/lib/chrony/drift
Feb 02 09:35:07 compute-1 chronyd[58619]: Loaded seccomp filter (level 2)
Feb 02 09:35:07 compute-1 systemd[1]: Started NTP client/server.
Feb 02 09:35:07 compute-1 sudo[58609]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:08 compute-1 sshd-session[53669]: Connection closed by 192.168.122.30 port 39194
Feb 02 09:35:08 compute-1 sshd-session[53666]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:35:08 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Feb 02 09:35:08 compute-1 systemd[1]: session-13.scope: Consumed 22.583s CPU time.
Feb 02 09:35:08 compute-1 systemd-logind[805]: Session 13 logged out. Waiting for processes to exit.
Feb 02 09:35:08 compute-1 systemd-logind[805]: Removed session 13.
Feb 02 09:35:13 compute-1 sshd-session[58645]: Accepted publickey for zuul from 192.168.122.30 port 42234 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:35:13 compute-1 systemd-logind[805]: New session 14 of user zuul.
Feb 02 09:35:13 compute-1 systemd[1]: Started Session 14 of User zuul.
Feb 02 09:35:13 compute-1 sshd-session[58645]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:35:13 compute-1 sudo[58798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngdazwccrskznhtemfdazytwvctubrim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024913.2564223-22-39500420853350/AnsiballZ_file.py'
Feb 02 09:35:13 compute-1 sudo[58798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:14 compute-1 python3.9[58800]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:14 compute-1 sudo[58798]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:14 compute-1 sudo[58950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjoxcdkgukhtworropreptyzluecnsay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024914.3400397-58-245572025514354/AnsiballZ_stat.py'
Feb 02 09:35:14 compute-1 sudo[58950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:14 compute-1 python3.9[58952]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:14 compute-1 sudo[58950]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:15 compute-1 sudo[59073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjlbkuacqlhlwmqqzayrpzrvetqlcybm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024914.3400397-58-245572025514354/AnsiballZ_copy.py'
Feb 02 09:35:15 compute-1 sudo[59073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:15 compute-1 python3.9[59075]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024914.3400397-58-245572025514354/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:15 compute-1 sudo[59073]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:15 compute-1 sshd-session[58648]: Connection closed by 192.168.122.30 port 42234
Feb 02 09:35:15 compute-1 sshd-session[58645]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:35:15 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Feb 02 09:35:15 compute-1 systemd[1]: session-14.scope: Consumed 1.480s CPU time.
Feb 02 09:35:15 compute-1 systemd-logind[805]: Session 14 logged out. Waiting for processes to exit.
Feb 02 09:35:15 compute-1 systemd-logind[805]: Removed session 14.
Feb 02 09:35:21 compute-1 sshd-session[59100]: Accepted publickey for zuul from 192.168.122.30 port 39760 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:35:21 compute-1 systemd-logind[805]: New session 15 of user zuul.
Feb 02 09:35:21 compute-1 systemd[1]: Started Session 15 of User zuul.
Feb 02 09:35:21 compute-1 sshd-session[59100]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:35:22 compute-1 python3.9[59253]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:35:23 compute-1 sudo[59407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mysavkdkoxsiuwddsnhhuqjkexmftxqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024922.8239112-55-190246729204836/AnsiballZ_file.py'
Feb 02 09:35:23 compute-1 sudo[59407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:23 compute-1 python3.9[59409]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:23 compute-1 sudo[59407]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:24 compute-1 sudo[59582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrkirvygdckikbnvajnbmlrfysblifuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024923.6217854-79-242405258553402/AnsiballZ_stat.py'
Feb 02 09:35:24 compute-1 sudo[59582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:24 compute-1 python3.9[59584]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:24 compute-1 sudo[59582]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:24 compute-1 sudo[59705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvwwxphqplisfggzriiqqvfbmtjkfkce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024923.6217854-79-242405258553402/AnsiballZ_copy.py'
Feb 02 09:35:24 compute-1 sudo[59705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:24 compute-1 python3.9[59707]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1770024923.6217854-79-242405258553402/.source.json _original_basename=.q8wsgptc follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:24 compute-1 sudo[59705]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:25 compute-1 sudo[59857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntqrfvehpwhphguessfmpwgqwvprkxcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024925.2686567-148-78079986202256/AnsiballZ_stat.py'
Feb 02 09:35:25 compute-1 sudo[59857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:25 compute-1 python3.9[59859]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:25 compute-1 sudo[59857]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:26 compute-1 sudo[59980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeelnwveerotyqxkgwrenmmlpxtmmdxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024925.2686567-148-78079986202256/AnsiballZ_copy.py'
Feb 02 09:35:26 compute-1 sudo[59980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:26 compute-1 python3.9[59982]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024925.2686567-148-78079986202256/.source _original_basename=.ek_rd0vo follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:26 compute-1 sudo[59980]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:26 compute-1 sudo[60132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmwoagwcvaohszembemtzbxjfmrsplid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024926.5727654-196-41063662736735/AnsiballZ_file.py'
Feb 02 09:35:26 compute-1 sudo[60132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:26 compute-1 python3.9[60134]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:35:26 compute-1 sudo[60132]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:27 compute-1 sudo[60284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwsowxckjdjpnywevzcofhejawrbbswz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024927.2232182-220-246681203115187/AnsiballZ_stat.py'
Feb 02 09:35:27 compute-1 sudo[60284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:27 compute-1 python3.9[60286]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:27 compute-1 sudo[60284]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:28 compute-1 sudo[60407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lequcpsnafxdjibvvtqhpyqdzugjhrrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024927.2232182-220-246681203115187/AnsiballZ_copy.py'
Feb 02 09:35:28 compute-1 sudo[60407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:28 compute-1 python3.9[60409]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024927.2232182-220-246681203115187/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:35:28 compute-1 sudo[60407]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:28 compute-1 sudo[60559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdapotpjijodcgjnpbconfcfkmmlplxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024928.3833046-220-31431851639660/AnsiballZ_stat.py'
Feb 02 09:35:28 compute-1 sudo[60559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:28 compute-1 python3.9[60561]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:29 compute-1 sudo[60559]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:29 compute-1 sudo[60682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnfoistqbipvhmvtmxnhirwhxqwujnem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024928.3833046-220-31431851639660/AnsiballZ_copy.py'
Feb 02 09:35:29 compute-1 sudo[60682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:29 compute-1 python3.9[60684]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770024928.3833046-220-31431851639660/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:35:29 compute-1 sudo[60682]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:30 compute-1 sudo[60834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzqfabkkqlvjfstovlzzrqmivghzlbrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024929.7849023-307-93609213627689/AnsiballZ_file.py'
Feb 02 09:35:30 compute-1 sudo[60834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:30 compute-1 python3.9[60836]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:30 compute-1 sudo[60834]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:30 compute-1 sudo[60986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqwokfwhpyuzqksqomtyujxmmbotofea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024930.4272084-331-43439646552512/AnsiballZ_stat.py'
Feb 02 09:35:30 compute-1 sudo[60986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:30 compute-1 python3.9[60988]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:30 compute-1 sudo[60986]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:31 compute-1 sudo[61109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvzgfohrqbbzpzouinlylbzrnhrflyjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024930.4272084-331-43439646552512/AnsiballZ_copy.py'
Feb 02 09:35:31 compute-1 sudo[61109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:31 compute-1 python3.9[61111]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024930.4272084-331-43439646552512/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:31 compute-1 sudo[61109]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:31 compute-1 sudo[61261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eagpechcqzczfyvijymifuqcilxjfuox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024931.6567607-376-8828196314413/AnsiballZ_stat.py'
Feb 02 09:35:31 compute-1 sudo[61261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:32 compute-1 python3.9[61263]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:32 compute-1 sudo[61261]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:32 compute-1 sudo[61384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpyifypywawhaobzcxktsgxaubelfvgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024931.6567607-376-8828196314413/AnsiballZ_copy.py'
Feb 02 09:35:32 compute-1 sudo[61384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:32 compute-1 python3.9[61386]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024931.6567607-376-8828196314413/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:32 compute-1 sudo[61384]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:33 compute-1 sudo[61536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aveakadgjcpxeowbywuzsjqqlxkkhpec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024932.8462603-421-15687774982478/AnsiballZ_systemd.py'
Feb 02 09:35:33 compute-1 sudo[61536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:33 compute-1 python3.9[61538]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:35:33 compute-1 systemd[1]: Reloading.
Feb 02 09:35:33 compute-1 systemd-sysv-generator[61568]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:35:33 compute-1 systemd-rc-local-generator[61563]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:35:33 compute-1 systemd[1]: Reloading.
Feb 02 09:35:34 compute-1 systemd-rc-local-generator[61605]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:35:34 compute-1 systemd-sysv-generator[61609]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:35:34 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Feb 02 09:35:34 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Feb 02 09:35:34 compute-1 sudo[61536]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:34 compute-1 sudo[61764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggwuooakvsopnovsxxokedytpjuryqoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024934.3337915-445-190825462867451/AnsiballZ_stat.py'
Feb 02 09:35:34 compute-1 sudo[61764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:34 compute-1 python3.9[61766]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:34 compute-1 sudo[61764]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:35 compute-1 sudo[61887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voldnzmtqpxsugxuveschkdahywmaxiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024934.3337915-445-190825462867451/AnsiballZ_copy.py'
Feb 02 09:35:35 compute-1 sudo[61887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:35 compute-1 python3.9[61889]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024934.3337915-445-190825462867451/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:35 compute-1 sudo[61887]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:35 compute-1 sudo[62039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iruoezgavgjmmxazdawrqvtmgnthdixd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024935.567769-490-40701539531467/AnsiballZ_stat.py'
Feb 02 09:35:35 compute-1 sudo[62039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:36 compute-1 python3.9[62041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:36 compute-1 sudo[62039]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:36 compute-1 sudo[62162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drzhxfrnksywsrjvyzgbpyyvivocvysh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024935.567769-490-40701539531467/AnsiballZ_copy.py'
Feb 02 09:35:36 compute-1 sudo[62162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:36 compute-1 python3.9[62164]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024935.567769-490-40701539531467/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:36 compute-1 sudo[62162]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:37 compute-1 sudo[62314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnckauuqjgauojlwrudqqsnucnrrmkag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024936.9276702-535-194018994538075/AnsiballZ_systemd.py'
Feb 02 09:35:37 compute-1 sudo[62314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:37 compute-1 python3.9[62316]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:35:37 compute-1 systemd[1]: Reloading.
Feb 02 09:35:37 compute-1 systemd-sysv-generator[62346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:35:37 compute-1 systemd-rc-local-generator[62341]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:35:37 compute-1 systemd[1]: Reloading.
Feb 02 09:35:37 compute-1 systemd-sysv-generator[62381]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:35:37 compute-1 systemd-rc-local-generator[62378]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:35:37 compute-1 systemd[1]: Starting Create netns directory...
Feb 02 09:35:37 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 02 09:35:37 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 02 09:35:37 compute-1 systemd[1]: Finished Create netns directory.
Feb 02 09:35:38 compute-1 sudo[62314]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:38 compute-1 python3.9[62543]: ansible-ansible.builtin.service_facts Invoked
Feb 02 09:35:38 compute-1 network[62560]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 02 09:35:38 compute-1 network[62561]: 'network-scripts' will be removed from distribution in near future.
Feb 02 09:35:38 compute-1 network[62562]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 02 09:35:42 compute-1 sudo[62822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnfebwluqbghvbsvqrtaqkvsfmopnmdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024942.711107-583-155114647135837/AnsiballZ_systemd.py'
Feb 02 09:35:42 compute-1 sudo[62822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:43 compute-1 python3.9[62824]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:35:43 compute-1 systemd[1]: Reloading.
Feb 02 09:35:43 compute-1 systemd-sysv-generator[62857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:35:43 compute-1 systemd-rc-local-generator[62854]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:35:43 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Feb 02 09:35:43 compute-1 iptables.init[62864]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb 02 09:35:43 compute-1 iptables.init[62864]: iptables: Flushing firewall rules: [  OK  ]
Feb 02 09:35:43 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Feb 02 09:35:43 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Feb 02 09:35:43 compute-1 sudo[62822]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:44 compute-1 sudo[63058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyoiyhtpgarrqbdnmdygpglrpjjkxfus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024943.9703588-583-81031923686007/AnsiballZ_systemd.py'
Feb 02 09:35:44 compute-1 sudo[63058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:44 compute-1 python3.9[63060]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:35:44 compute-1 sudo[63058]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:45 compute-1 sudo[63212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swppytuonckkzjdyvfhnxfltokmzbuhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024944.8353577-631-16704020554513/AnsiballZ_systemd.py'
Feb 02 09:35:45 compute-1 sudo[63212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:45 compute-1 python3.9[63214]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:35:45 compute-1 systemd[1]: Reloading.
Feb 02 09:35:45 compute-1 systemd-sysv-generator[63243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:35:45 compute-1 systemd-rc-local-generator[63239]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:35:45 compute-1 systemd[1]: Starting Netfilter Tables...
Feb 02 09:35:45 compute-1 systemd[1]: Finished Netfilter Tables.
Feb 02 09:35:45 compute-1 sudo[63212]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:46 compute-1 sudo[63404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxfwefxbxbaqsbyhwueogbeucucaetam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024945.8925514-655-163490380286415/AnsiballZ_command.py'
Feb 02 09:35:46 compute-1 sudo[63404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:46 compute-1 python3.9[63406]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:35:46 compute-1 sudo[63404]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:47 compute-1 sudo[63557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufzpghujhcsvcjkapwptpranrpqujinj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024947.1182163-697-176938743890081/AnsiballZ_stat.py'
Feb 02 09:35:47 compute-1 sudo[63557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:47 compute-1 python3.9[63559]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:47 compute-1 sudo[63557]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:47 compute-1 sudo[63682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpfxugyzazkojrzycljqloqmqsothdxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024947.1182163-697-176938743890081/AnsiballZ_copy.py'
Feb 02 09:35:47 compute-1 sudo[63682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:48 compute-1 python3.9[63684]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024947.1182163-697-176938743890081/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:48 compute-1 sudo[63682]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:48 compute-1 sudo[63835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygkpyofiqxlbiywbaeomhojrfcxiaooj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024948.4490628-742-50542454616746/AnsiballZ_systemd.py'
Feb 02 09:35:48 compute-1 sudo[63835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:49 compute-1 python3.9[63837]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:35:49 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Feb 02 09:35:49 compute-1 sshd[1010]: Received SIGHUP; restarting.
Feb 02 09:35:49 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Feb 02 09:35:49 compute-1 sshd[1010]: Server listening on 0.0.0.0 port 22.
Feb 02 09:35:49 compute-1 sshd[1010]: Server listening on :: port 22.
Feb 02 09:35:49 compute-1 sudo[63835]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:49 compute-1 sudo[63991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-totqzrzvxgarraokzfmkjaborjghbzaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024949.3097255-766-169387491478831/AnsiballZ_file.py'
Feb 02 09:35:49 compute-1 sudo[63991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:49 compute-1 python3.9[63993]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:49 compute-1 sudo[63991]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:50 compute-1 sudo[64143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwhhvcmzonknzxxnzopusfocysgesuyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024949.9953647-790-95178274619575/AnsiballZ_stat.py'
Feb 02 09:35:50 compute-1 sudo[64143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:50 compute-1 python3.9[64145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:50 compute-1 sudo[64143]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:50 compute-1 sudo[64266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuxteujnhblwqyfcbgvqkmbrktplkstw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024949.9953647-790-95178274619575/AnsiballZ_copy.py'
Feb 02 09:35:50 compute-1 sudo[64266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:50 compute-1 python3.9[64268]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024949.9953647-790-95178274619575/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:51 compute-1 sudo[64266]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:51 compute-1 sudo[64418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtackkutoylwfsjmnhcpnrzbnqprcxzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024951.5219772-844-19747577755241/AnsiballZ_timezone.py'
Feb 02 09:35:51 compute-1 sudo[64418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:52 compute-1 python3.9[64420]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 02 09:35:52 compute-1 systemd[1]: Starting Time & Date Service...
Feb 02 09:35:52 compute-1 systemd[1]: Started Time & Date Service.
Feb 02 09:35:52 compute-1 sudo[64418]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:52 compute-1 sudo[64574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdasozvjxdcscnilcdasjetkyvvdlhfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024952.5889068-871-76027739785662/AnsiballZ_file.py'
Feb 02 09:35:52 compute-1 sudo[64574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:53 compute-1 python3.9[64576]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:53 compute-1 sudo[64574]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:53 compute-1 sudo[64726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngfeaipbvhzmvgpkhwzcflclygaliaql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024953.2336028-895-61303885170294/AnsiballZ_stat.py'
Feb 02 09:35:53 compute-1 sudo[64726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:53 compute-1 python3.9[64728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:53 compute-1 sudo[64726]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:54 compute-1 sudo[64849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qezccrmandlshadeffffnjxlxjovvwcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024953.2336028-895-61303885170294/AnsiballZ_copy.py'
Feb 02 09:35:54 compute-1 sudo[64849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:54 compute-1 python3.9[64851]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024953.2336028-895-61303885170294/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:54 compute-1 sudo[64849]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:54 compute-1 sudo[65001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lndzclixrrhglsdwzhvpjhrgtyrtsyvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024954.56732-940-7544299619871/AnsiballZ_stat.py'
Feb 02 09:35:54 compute-1 sudo[65001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:55 compute-1 python3.9[65003]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:55 compute-1 sudo[65001]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:55 compute-1 sudo[65124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwosavunbtggracypfttcaowyyiixawo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024954.56732-940-7544299619871/AnsiballZ_copy.py'
Feb 02 09:35:55 compute-1 sudo[65124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:55 compute-1 python3.9[65126]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770024954.56732-940-7544299619871/.source.yaml _original_basename=.4bw___po follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:55 compute-1 sudo[65124]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:56 compute-1 sudo[65276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umjzlikbjbbzyrwbdwswjxwbolunxulf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024955.811372-986-118083414410790/AnsiballZ_stat.py'
Feb 02 09:35:56 compute-1 sudo[65276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:56 compute-1 python3.9[65278]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:56 compute-1 sudo[65276]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:56 compute-1 sudo[65399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibnhnzxotdzuwaztlqwrmyzqnlbdrybu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024955.811372-986-118083414410790/AnsiballZ_copy.py'
Feb 02 09:35:56 compute-1 sudo[65399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:56 compute-1 python3.9[65401]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024955.811372-986-118083414410790/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:35:56 compute-1 sudo[65399]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:57 compute-1 sudo[65551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcerkpvjzxsgiidpdvtfbilhjwbrfuxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024956.9984-1030-176825670485808/AnsiballZ_command.py'
Feb 02 09:35:57 compute-1 sudo[65551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:57 compute-1 python3.9[65553]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:35:57 compute-1 sudo[65551]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:57 compute-1 sudo[65704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvegtsvyhjrrqxrmzogpewqttdtqrlag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024957.6291473-1054-261679533215354/AnsiballZ_command.py'
Feb 02 09:35:57 compute-1 sudo[65704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:58 compute-1 python3.9[65706]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:35:58 compute-1 sudo[65704]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:58 compute-1 sudo[65857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dntpbcugkponzevzpbjdxlfyvxqsbnqt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1770024958.2908185-1078-36689168800919/AnsiballZ_edpm_nftables_from_files.py'
Feb 02 09:35:58 compute-1 sudo[65857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:58 compute-1 python3[65859]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 02 09:35:58 compute-1 sudo[65857]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:59 compute-1 sudo[66009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qikszhechpkqzovonljzyspizikvsqnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024959.0886939-1102-239301028363261/AnsiballZ_stat.py'
Feb 02 09:35:59 compute-1 sudo[66009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:35:59 compute-1 python3.9[66011]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:35:59 compute-1 sudo[66009]: pam_unix(sudo:session): session closed for user root
Feb 02 09:35:59 compute-1 sudo[66132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxpcmgfvwsynaxcsmimgkxfsrfammlka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024959.0886939-1102-239301028363261/AnsiballZ_copy.py'
Feb 02 09:35:59 compute-1 sudo[66132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:00 compute-1 python3.9[66134]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024959.0886939-1102-239301028363261/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:00 compute-1 sudo[66132]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:00 compute-1 sudo[66284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehvwywnxpqhuaguazkiyxnhnwmfqfvgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024960.2624795-1147-280765877889088/AnsiballZ_stat.py'
Feb 02 09:36:00 compute-1 sudo[66284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:00 compute-1 python3.9[66286]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:36:00 compute-1 sudo[66284]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:00 compute-1 sudo[66407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wounlwnmtseondenwfqjuojpnvsavjmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024960.2624795-1147-280765877889088/AnsiballZ_copy.py'
Feb 02 09:36:00 compute-1 sudo[66407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:01 compute-1 python3.9[66409]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024960.2624795-1147-280765877889088/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:01 compute-1 sudo[66407]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:01 compute-1 sudo[66559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slejuqhnbfvropawlrwsjcbgcebgeiwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024961.3785193-1193-249992278542607/AnsiballZ_stat.py'
Feb 02 09:36:01 compute-1 sudo[66559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:01 compute-1 python3.9[66561]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:36:01 compute-1 sudo[66559]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:02 compute-1 sudo[66682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhxsdffrnrmmbhzuxtmoeuwgutfrjoxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024961.3785193-1193-249992278542607/AnsiballZ_copy.py'
Feb 02 09:36:02 compute-1 sudo[66682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:02 compute-1 python3.9[66684]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024961.3785193-1193-249992278542607/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:02 compute-1 sudo[66682]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:02 compute-1 sudo[66834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtmmsqlwgjekdzysdegqzlaluoeipyzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024962.539503-1237-144775975056401/AnsiballZ_stat.py'
Feb 02 09:36:02 compute-1 sudo[66834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:03 compute-1 python3.9[66836]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:36:03 compute-1 sudo[66834]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:03 compute-1 sudo[66957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziqdmsigxfoptuszppqmlwzefyucoith ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024962.539503-1237-144775975056401/AnsiballZ_copy.py'
Feb 02 09:36:03 compute-1 sudo[66957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:03 compute-1 python3.9[66959]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024962.539503-1237-144775975056401/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:03 compute-1 sudo[66957]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:04 compute-1 sudo[67109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syuwmgqrfhebarqhxluchmafxtafxgzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024963.721225-1282-246468476647806/AnsiballZ_stat.py'
Feb 02 09:36:04 compute-1 sudo[67109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:04 compute-1 python3.9[67111]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:36:04 compute-1 sudo[67109]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:04 compute-1 sudo[67232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhjebznjbxdesjngliqoawrlcparzliz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024963.721225-1282-246468476647806/AnsiballZ_copy.py'
Feb 02 09:36:04 compute-1 sudo[67232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:04 compute-1 python3.9[67234]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770024963.721225-1282-246468476647806/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:04 compute-1 sudo[67232]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:05 compute-1 sudo[67384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsmjfrpepjbcmvltqcwggprpoutgrtke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024964.976588-1327-35331526072210/AnsiballZ_file.py'
Feb 02 09:36:05 compute-1 sudo[67384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:05 compute-1 python3.9[67386]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:05 compute-1 sudo[67384]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:05 compute-1 sudo[67536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvppjhwwlgcsuixovwprlhvaejnobqhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024965.6597018-1351-61931174741481/AnsiballZ_command.py'
Feb 02 09:36:05 compute-1 sudo[67536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:06 compute-1 python3.9[67538]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:36:06 compute-1 sudo[67536]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:06 compute-1 sudo[67695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unrsiuspiufjxovanxhfcbnwiqdjyvkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024966.4033163-1375-138732088375218/AnsiballZ_blockinfile.py'
Feb 02 09:36:06 compute-1 sudo[67695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:07 compute-1 python3.9[67697]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:07 compute-1 sudo[67695]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:07 compute-1 sudo[67848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdatcngfmzyottvhxwrymjyqgxlztutu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024967.4014566-1402-132254602024277/AnsiballZ_file.py'
Feb 02 09:36:07 compute-1 sudo[67848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:07 compute-1 python3.9[67850]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:07 compute-1 sudo[67848]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:08 compute-1 sudo[68000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqghxdxbeasrngdpaknyptmbyqkkpasp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024967.9705055-1402-272506043943043/AnsiballZ_file.py'
Feb 02 09:36:08 compute-1 sudo[68000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:08 compute-1 python3.9[68002]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:08 compute-1 sudo[68000]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:09 compute-1 sudo[68152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psgrsxdqlseykvcybbetafslhrgtpgrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024968.7140512-1447-139079261292410/AnsiballZ_mount.py'
Feb 02 09:36:09 compute-1 sudo[68152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:09 compute-1 python3.9[68154]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 02 09:36:09 compute-1 sudo[68152]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:09 compute-1 sudo[68305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adrwlokaefkvpslkircqopogjqemenfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024969.5141053-1447-197560413493306/AnsiballZ_mount.py'
Feb 02 09:36:09 compute-1 sudo[68305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:09 compute-1 python3.9[68307]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 02 09:36:09 compute-1 sudo[68305]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:10 compute-1 sshd-session[59103]: Connection closed by 192.168.122.30 port 39760
Feb 02 09:36:10 compute-1 sshd-session[59100]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:36:10 compute-1 systemd-logind[805]: Session 15 logged out. Waiting for processes to exit.
Feb 02 09:36:10 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Feb 02 09:36:10 compute-1 systemd[1]: session-15.scope: Consumed 31.215s CPU time.
Feb 02 09:36:10 compute-1 systemd-logind[805]: Removed session 15.
Feb 02 09:36:15 compute-1 sshd-session[68333]: Accepted publickey for zuul from 192.168.122.30 port 59912 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:36:15 compute-1 systemd-logind[805]: New session 16 of user zuul.
Feb 02 09:36:15 compute-1 systemd[1]: Started Session 16 of User zuul.
Feb 02 09:36:15 compute-1 sshd-session[68333]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:36:16 compute-1 sudo[68486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmecnagmpnqfthhugqncpclfdcdjwinp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024975.7738028-19-78963916472672/AnsiballZ_tempfile.py'
Feb 02 09:36:16 compute-1 sudo[68486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:16 compute-1 python3.9[68488]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 02 09:36:16 compute-1 sudo[68486]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:17 compute-1 sudo[68638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eydhkjwhyepdnipzlrydsezzacemwvmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024976.604469-55-35930145504195/AnsiballZ_stat.py'
Feb 02 09:36:17 compute-1 sudo[68638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:17 compute-1 python3.9[68640]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:36:17 compute-1 sudo[68638]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:18 compute-1 sudo[68790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dewrmrhzbgxeqsbhkuqknxlraeiqcjhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024977.50209-85-84461788807973/AnsiballZ_setup.py'
Feb 02 09:36:18 compute-1 sudo[68790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:18 compute-1 python3.9[68792]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:36:18 compute-1 sudo[68790]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:18 compute-1 sudo[68942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvmiqsyoyzdcatruzquajzxuoszxwxrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024978.5793054-110-62591594513340/AnsiballZ_blockinfile.py'
Feb 02 09:36:18 compute-1 sudo[68942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:19 compute-1 python3.9[68944]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDTA16t8OsOL4s99BOiNF3vckRPwnc9DwrgEMUjNAF5ofBbR7O7JlFD47GnI33lZr51vVc0wnvTxhpFA0jVvhKqVWdJ3lApNf34bJmaJBr8uiy/i3Q84MsUtXBLQ0FDCbwgaPnreNbMz3ae+u9H+Z73jQSP+gnQ5oYWhONHgO4HHkF8K7a8Bow3H5qwfbHz8o7mFQmTpYHwOcwhA53BTbh1NiEJZJNSg7wi1hH7vELUAzts1cbF2slTE0nh8XjMogq9ukokrCIKfE+xX7PmAawCuMnfvGX93zF1298pGcUKqvpnIfUOMDGtJtYEZ8sWsr5aH1YXIoJfHuux/YosRx3XDD5oEcpX0nYKVW6bumHsFIS199XAM5LtWWNr2eMcrbZhVwHNdELC6zoL7QjbBQ+2j/+8nJLq9vIghewgO3EFWK3r7kIVQZg8GYLZ/yisH4cvzUTACRXAF+1o2rq+AUfX3nTSsrqyZQUwlnWpc1vsceEO0Lsuac5tvGylnsJBfmM=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN317jbKb2FNELHPgcKtyDLq5kCgCZN/b/8qYDuirt4l
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNpgfrlTfGut7rGFnGEpIiXrs2U1SQK0Fr1bAmmw8notvdnn6jtGfPfwX96hGwcOu4AlAS/i7X7XgbLw573Ooww=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDXvxaVTYbHTHv+9EzKdF3T8+Yr2otW2YLuSqNTF+yJaKACfB7wDlIhKDGTHiU1FDrkO4tJ+R3OL/2ZXoIlxp5JSdCgcb42X+5PTj1wPkayVlQW7e0wQvT3kYhrcPtjLgk4T39/sionMGYUat45idwoB6hUSPLdk/L5+n0/3LEg1lByOM/B1/p8wGzHn6H9CWoIP3Ctd6lmrxtIVU1u+pxiBVQCcMjw5gtqsB54l670fL7El5XEkqjRjKHhylw9QTYN3AWMKuQKwcjClm/57/SoFMP7o52r653wGDH9cpvDgs0RYG4bA1mGY5OMkYbDJfcy0CViKEu5qWW4cTBLh/Z88D2EuNlINj3Q1YJk3RwF6vYl31MMsbBW10YhIiBJrA5XF0BLARqBOZ1e6v7JKTSwa7wGGtRzEzbY+me9zl6ZhhDru/I+h24J4MeBA07HvQIS2v8O95tPz76YZJ3DkWlywFWbALG8M4+fkpuQtvVpBZMgdvIWW0kfXO/grGnrgY8=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIG3OEs+fDFWrKRKifY4uXYtOpS/6/8E88qPQNs1apj/z
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFy9hRh0QDNcy30491f4FwmL+9BopSuPxbkVyWhY9VytT/FG5rm9/DLYyukpd9IKttcZyerq0gzfokDrht76FB4=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpaaLVd9Gqbxcksz46sKNkp3Eu2TY3fUjtOhbkLQru93qJt/RNDTocNiUrE9VAj/UXp9dZqSHg1Hr7ScqXu7zqgZ9i+mq6N7P7QR+ZkN8jLQSybnPztI7X/QWaPhT0j1ArMrYk2F2Me+kAQiFL0GoR2d8udRElL8YKKIYQ6zjC/h2ZsU0WyVET9uiTgeMP/njtMzRSgO2Wp6no4KqJEOMSEY1lgURjVsMWkTr4hGz523SooA41GzquuNamnj1ELwKZSAH+TtVgI8oFJ2T+5TZiE/oW2MizbBwjKA3V5DlnGOEG49eG+LhZ/eWb6jQ7OnJARA/iLU/FsJ+CaGSbRK20/OWXP4JSZu7liaD0DIHM0DwrjEnQcXI6SbfAoAQ494KFtZvFamem7CPtrVhgNAKqybRbDcEQGpDxQgrWeA3m4HyGIBym+IvMUfYlNke9frCkwNpXRH93TK6E/ziPFrBHKkdRcFxVdsG2u1Y+adxOQk7KCjq/skzXBPCPDaHnzBM=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIKtQmhiX/LRkxZONUn47u07V1HNePVW1EWKmTbmuGuY
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE0cPV3BwiB9Cc5Ne48bCCSZwMzF/hH7iFXwAiP/TK2pzWYsdZw1mOSJ+vDu1KclkDtQKmwN6Cu0N7j7domqlzE=
                                             create=True mode=0644 path=/tmp/ansible.50jbc6k_ state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:19 compute-1 sudo[68942]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:19 compute-1 sudo[69094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-turorydzjkmhcrmcjxxtiulmnwddiuok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024979.365117-134-166253400170266/AnsiballZ_command.py'
Feb 02 09:36:19 compute-1 sudo[69094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:19 compute-1 python3.9[69096]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.50jbc6k_' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:36:19 compute-1 sudo[69094]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:20 compute-1 sudo[69248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebpgviyqmcuqorzgwpnvdzbzatxvkbac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024980.181195-158-231775216893657/AnsiballZ_file.py'
Feb 02 09:36:20 compute-1 sudo[69248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:20 compute-1 python3.9[69250]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.50jbc6k_ state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:20 compute-1 sudo[69248]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:21 compute-1 sshd-session[68336]: Connection closed by 192.168.122.30 port 59912
Feb 02 09:36:21 compute-1 sshd-session[68333]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:36:21 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Feb 02 09:36:21 compute-1 systemd[1]: session-16.scope: Consumed 2.798s CPU time.
Feb 02 09:36:21 compute-1 systemd-logind[805]: Session 16 logged out. Waiting for processes to exit.
Feb 02 09:36:21 compute-1 systemd-logind[805]: Removed session 16.
Feb 02 09:36:22 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 02 09:36:26 compute-1 sshd-session[69277]: Accepted publickey for zuul from 192.168.122.30 port 43578 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:36:26 compute-1 systemd-logind[805]: New session 17 of user zuul.
Feb 02 09:36:26 compute-1 systemd[1]: Started Session 17 of User zuul.
Feb 02 09:36:26 compute-1 sshd-session[69277]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:36:27 compute-1 python3.9[69430]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:36:28 compute-1 sudo[69584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eienxztbpwgljxqxplkniayuqtauwsml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024988.336982-52-48011571362903/AnsiballZ_systemd.py'
Feb 02 09:36:28 compute-1 sudo[69584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:29 compute-1 python3.9[69586]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 02 09:36:29 compute-1 sudo[69584]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:29 compute-1 sudo[69738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owkissyporktqgrkznkarjntkhyahufh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024989.4141994-76-1309993261939/AnsiballZ_systemd.py'
Feb 02 09:36:29 compute-1 sudo[69738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:29 compute-1 python3.9[69740]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:36:29 compute-1 sudo[69738]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:30 compute-1 sudo[69891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifrvgalmjghcbmiovzppefikhitciqah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024990.251528-103-150871270910790/AnsiballZ_command.py'
Feb 02 09:36:30 compute-1 sudo[69891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:30 compute-1 python3.9[69893]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:36:30 compute-1 sudo[69891]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:31 compute-1 sudo[70044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktpghvwtfmanxjtfmwmhfeywctmoncix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024991.0317225-127-93429051404133/AnsiballZ_stat.py'
Feb 02 09:36:31 compute-1 sudo[70044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:31 compute-1 python3.9[70046]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:36:31 compute-1 sudo[70044]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:32 compute-1 sudo[70198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbrwesktdnilsnmnsddbglwqelfqkvgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024991.7820525-151-170321590600538/AnsiballZ_command.py'
Feb 02 09:36:32 compute-1 sudo[70198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:32 compute-1 python3.9[70200]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:36:32 compute-1 sudo[70198]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:32 compute-1 sudo[70353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqvwbokhsxzvfyynwqipdoeefcdzqdqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770024992.4709752-175-76588699134084/AnsiballZ_file.py'
Feb 02 09:36:32 compute-1 sudo[70353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:33 compute-1 python3.9[70355]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:36:33 compute-1 sudo[70353]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:33 compute-1 sshd-session[69280]: Connection closed by 192.168.122.30 port 43578
Feb 02 09:36:33 compute-1 sshd-session[69277]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:36:33 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Feb 02 09:36:33 compute-1 systemd[1]: session-17.scope: Consumed 3.779s CPU time.
Feb 02 09:36:33 compute-1 systemd-logind[805]: Session 17 logged out. Waiting for processes to exit.
Feb 02 09:36:33 compute-1 systemd-logind[805]: Removed session 17.
Feb 02 09:36:38 compute-1 sshd-session[70380]: Accepted publickey for zuul from 192.168.122.30 port 40462 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:36:38 compute-1 systemd-logind[805]: New session 18 of user zuul.
Feb 02 09:36:38 compute-1 systemd[1]: Started Session 18 of User zuul.
Feb 02 09:36:38 compute-1 sshd-session[70380]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:36:40 compute-1 python3.9[70533]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:36:40 compute-1 sudo[70687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdzzkxdqdjomesoittboulhlcxzyhvti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025000.4758568-58-167023473382197/AnsiballZ_setup.py'
Feb 02 09:36:40 compute-1 sudo[70687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:41 compute-1 python3.9[70689]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:36:41 compute-1 sudo[70687]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:41 compute-1 sudo[70771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftjmkiucoujbxhysaakqcjsmryzebppe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025000.4758568-58-167023473382197/AnsiballZ_dnf.py'
Feb 02 09:36:41 compute-1 sudo[70771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:41 compute-1 python3.9[70773]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 02 09:36:42 compute-1 sudo[70771]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:43 compute-1 python3.9[70924]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:36:45 compute-1 python3.9[71075]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 02 09:36:45 compute-1 python3.9[71225]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:36:45 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:36:46 compute-1 python3.9[71376]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:36:47 compute-1 sshd-session[70383]: Connection closed by 192.168.122.30 port 40462
Feb 02 09:36:47 compute-1 sshd-session[70380]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:36:47 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Feb 02 09:36:47 compute-1 systemd[1]: session-18.scope: Consumed 5.128s CPU time.
Feb 02 09:36:47 compute-1 systemd-logind[805]: Session 18 logged out. Waiting for processes to exit.
Feb 02 09:36:47 compute-1 systemd-logind[805]: Removed session 18.
Feb 02 09:36:55 compute-1 sshd-session[71401]: Accepted publickey for zuul from 38.102.83.241 port 53408 ssh2: RSA SHA256:oiZKnX5kwvqrsUV0ZZjSac+GUqsMvprIFYZPo6yyNjU
Feb 02 09:36:55 compute-1 systemd-logind[805]: New session 19 of user zuul.
Feb 02 09:36:55 compute-1 systemd[1]: Started Session 19 of User zuul.
Feb 02 09:36:55 compute-1 sshd-session[71401]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:36:55 compute-1 sudo[71477]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjhviaclhxlzkegzbijjitluholrpuyn ; /usr/bin/python3'
Feb 02 09:36:55 compute-1 sudo[71477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:56 compute-1 useradd[71481]: new group: name=ceph-admin, GID=42478
Feb 02 09:36:56 compute-1 useradd[71481]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Feb 02 09:36:56 compute-1 sudo[71477]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:56 compute-1 sudo[71563]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-outmcdytuzfzwbooontphtrkcmpogqzj ; /usr/bin/python3'
Feb 02 09:36:56 compute-1 sudo[71563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:56 compute-1 sudo[71563]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:56 compute-1 sudo[71636]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moatcqblwyqxxjedlawdtmdipahsolcc ; /usr/bin/python3'
Feb 02 09:36:56 compute-1 sudo[71636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:57 compute-1 sudo[71636]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:57 compute-1 sudo[71686]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tybybfcvsqkmmiljkyprktarlqukbssr ; /usr/bin/python3'
Feb 02 09:36:57 compute-1 sudo[71686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:57 compute-1 sudo[71686]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:57 compute-1 sudo[71712]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkjhxsxalflwurqphqhiemrivgurqnxp ; /usr/bin/python3'
Feb 02 09:36:57 compute-1 sudo[71712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:57 compute-1 sudo[71712]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:58 compute-1 sudo[71738]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvekgpnonuhhjiwcycgafvwfgnfnjymk ; /usr/bin/python3'
Feb 02 09:36:58 compute-1 sudo[71738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:58 compute-1 sudo[71738]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:58 compute-1 sudo[71764]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfkmaahzjijjjgwrnvqzepqramlzcgif ; /usr/bin/python3'
Feb 02 09:36:58 compute-1 sudo[71764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:58 compute-1 sudo[71764]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:59 compute-1 sudo[71842]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umvgusodrdcxxsreaoponlovzokpddik ; /usr/bin/python3'
Feb 02 09:36:59 compute-1 sudo[71842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:59 compute-1 sudo[71842]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:59 compute-1 sudo[71915]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsqtxrbhgioxallzmvtetbqtnfuqitvm ; /usr/bin/python3'
Feb 02 09:36:59 compute-1 sudo[71915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:36:59 compute-1 sudo[71915]: pam_unix(sudo:session): session closed for user root
Feb 02 09:36:59 compute-1 sudo[72019]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pszijuufwygedagslbwovqhhtvmiyiow ; /usr/bin/python3'
Feb 02 09:36:59 compute-1 sudo[72019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:00 compute-1 sudo[72019]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:00 compute-1 sudo[72092]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msnlherwrilrqipwdgbenssoeluhfljr ; /usr/bin/python3'
Feb 02 09:37:00 compute-1 sudo[72092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:00 compute-1 sshd-session[71942]: Invalid user solv from 80.94.92.184 port 54228
Feb 02 09:37:00 compute-1 sudo[72092]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:00 compute-1 sshd-session[71942]: Connection closed by invalid user solv 80.94.92.184 port 54228 [preauth]
Feb 02 09:37:00 compute-1 sudo[72142]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtqknvxftfuuvcqjrqbmuuudjkzjwdmh ; /usr/bin/python3'
Feb 02 09:37:00 compute-1 sudo[72142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:01 compute-1 python3[72144]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:37:01 compute-1 sudo[72142]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:02 compute-1 sudo[72237]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsqjftyjtdvxnvhknjssbuphfvmcpsqb ; /usr/bin/python3'
Feb 02 09:37:02 compute-1 sudo[72237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:02 compute-1 python3[72239]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 02 09:37:03 compute-1 sudo[72237]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:04 compute-1 sudo[72264]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fghwwvbydvhghevadgeiaahpjhyypqon ; /usr/bin/python3'
Feb 02 09:37:04 compute-1 sudo[72264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:04 compute-1 python3[72266]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 02 09:37:04 compute-1 sudo[72264]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:04 compute-1 sudo[72290]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxtacevdihuyswdnnofzepcowjxtbapk ; /usr/bin/python3'
Feb 02 09:37:04 compute-1 sudo[72290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:04 compute-1 python3[72292]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:37:04 compute-1 kernel: loop: module loaded
Feb 02 09:37:04 compute-1 kernel: loop3: detected capacity change from 0 to 41943040
Feb 02 09:37:04 compute-1 sudo[72290]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:04 compute-1 sudo[72325]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aawxrjdeubzhtsmlkjknouqpfkunbbvt ; /usr/bin/python3'
Feb 02 09:37:04 compute-1 sudo[72325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:04 compute-1 python3[72327]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:37:05 compute-1 lvm[72330]: PV /dev/loop3 not used.
Feb 02 09:37:05 compute-1 lvm[72332]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 02 09:37:05 compute-1 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Feb 02 09:37:05 compute-1 lvm[72335]:   1 logical volume(s) in volume group "ceph_vg0" now active
Feb 02 09:37:05 compute-1 lvm[72342]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 02 09:37:05 compute-1 lvm[72342]: VG ceph_vg0 finished
Feb 02 09:37:05 compute-1 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Feb 02 09:37:05 compute-1 sudo[72325]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:05 compute-1 sudo[72418]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfdxkmmmmumnuvwoboiznvimgouniloc ; /usr/bin/python3'
Feb 02 09:37:05 compute-1 sudo[72418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:05 compute-1 python3[72420]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 02 09:37:05 compute-1 sudo[72418]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:05 compute-1 sudo[72491]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hycrqbzofmuqpkvyytyhjyqnpeduwqku ; /usr/bin/python3'
Feb 02 09:37:05 compute-1 sudo[72491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:06 compute-1 python3[72493]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770025025.5627344-36892-11820083717439/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:37:06 compute-1 sudo[72491]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:06 compute-1 sudo[72541]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odrvobfdlptdnamlhwkgbiqpaxspglsf ; /usr/bin/python3'
Feb 02 09:37:06 compute-1 sudo[72541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:37:06 compute-1 python3[72543]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:37:06 compute-1 systemd[1]: Reloading.
Feb 02 09:37:06 compute-1 systemd-rc-local-generator[72563]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:37:06 compute-1 systemd-sysv-generator[72571]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:37:07 compute-1 systemd[1]: Starting Ceph OSD losetup...
Feb 02 09:37:07 compute-1 bash[72582]: /dev/loop3: [64513]:4329557 (/var/lib/ceph-osd-0.img)
Feb 02 09:37:07 compute-1 systemd[1]: Finished Ceph OSD losetup.
Feb 02 09:37:07 compute-1 sudo[72541]: pam_unix(sudo:session): session closed for user root
Feb 02 09:37:07 compute-1 lvm[72583]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 02 09:37:07 compute-1 lvm[72583]: VG ceph_vg0 finished
Feb 02 09:37:09 compute-1 python3[72607]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:37:16 compute-1 chronyd[58619]: Selected source 142.4.192.253 (pool.ntp.org)
Feb 02 09:38:28 compute-1 sshd-session[72651]: Accepted publickey for ceph-admin from 192.168.122.100 port 35128 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:28 compute-1 systemd-logind[805]: New session 20 of user ceph-admin.
Feb 02 09:38:28 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Feb 02 09:38:28 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Feb 02 09:38:28 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Feb 02 09:38:28 compute-1 systemd[1]: Starting User Manager for UID 42477...
Feb 02 09:38:28 compute-1 systemd[72655]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:28 compute-1 systemd[72655]: Queued start job for default target Main User Target.
Feb 02 09:38:28 compute-1 systemd[72655]: Created slice User Application Slice.
Feb 02 09:38:28 compute-1 systemd[72655]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 02 09:38:28 compute-1 systemd[72655]: Started Daily Cleanup of User's Temporary Directories.
Feb 02 09:38:28 compute-1 systemd[72655]: Reached target Paths.
Feb 02 09:38:28 compute-1 systemd[72655]: Reached target Timers.
Feb 02 09:38:28 compute-1 sshd-session[72669]: Accepted publickey for ceph-admin from 192.168.122.100 port 35142 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:28 compute-1 systemd[72655]: Starting D-Bus User Message Bus Socket...
Feb 02 09:38:28 compute-1 systemd[72655]: Starting Create User's Volatile Files and Directories...
Feb 02 09:38:28 compute-1 systemd-logind[805]: New session 22 of user ceph-admin.
Feb 02 09:38:28 compute-1 systemd[72655]: Finished Create User's Volatile Files and Directories.
Feb 02 09:38:28 compute-1 systemd[72655]: Listening on D-Bus User Message Bus Socket.
Feb 02 09:38:28 compute-1 systemd[72655]: Reached target Sockets.
Feb 02 09:38:28 compute-1 systemd[72655]: Reached target Basic System.
Feb 02 09:38:28 compute-1 systemd[72655]: Reached target Main User Target.
Feb 02 09:38:28 compute-1 systemd[72655]: Startup finished in 101ms.
Feb 02 09:38:28 compute-1 systemd[1]: Started User Manager for UID 42477.
Feb 02 09:38:28 compute-1 systemd[1]: Started Session 20 of User ceph-admin.
Feb 02 09:38:28 compute-1 systemd[1]: Started Session 22 of User ceph-admin.
Feb 02 09:38:28 compute-1 sshd-session[72651]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:28 compute-1 sshd-session[72669]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:28 compute-1 sudo[72676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:38:28 compute-1 sudo[72676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:28 compute-1 sudo[72676]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:28 compute-1 sshd-session[72701]: Accepted publickey for ceph-admin from 192.168.122.100 port 35150 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:28 compute-1 systemd-logind[805]: New session 23 of user ceph-admin.
Feb 02 09:38:28 compute-1 systemd[1]: Started Session 23 of User ceph-admin.
Feb 02 09:38:28 compute-1 sshd-session[72701]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:28 compute-1 sudo[72705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Feb 02 09:38:28 compute-1 sudo[72705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:28 compute-1 sudo[72705]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:29 compute-1 sshd-session[72730]: Accepted publickey for ceph-admin from 192.168.122.100 port 35164 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:29 compute-1 systemd-logind[805]: New session 24 of user ceph-admin.
Feb 02 09:38:29 compute-1 systemd[1]: Started Session 24 of User ceph-admin.
Feb 02 09:38:29 compute-1 sshd-session[72730]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:29 compute-1 sudo[72734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Feb 02 09:38:29 compute-1 sudo[72734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:29 compute-1 sudo[72734]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:29 compute-1 sshd-session[72759]: Accepted publickey for ceph-admin from 192.168.122.100 port 35172 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:29 compute-1 systemd-logind[805]: New session 25 of user ceph-admin.
Feb 02 09:38:29 compute-1 systemd[1]: Started Session 25 of User ceph-admin.
Feb 02 09:38:29 compute-1 sshd-session[72759]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:29 compute-1 sudo[72763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:38:29 compute-1 sudo[72763]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:29 compute-1 sudo[72763]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:29 compute-1 sshd-session[72788]: Accepted publickey for ceph-admin from 192.168.122.100 port 35186 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:29 compute-1 systemd-logind[805]: New session 26 of user ceph-admin.
Feb 02 09:38:29 compute-1 systemd[1]: Started Session 26 of User ceph-admin.
Feb 02 09:38:29 compute-1 sshd-session[72788]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:29 compute-1 sudo[72792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:38:29 compute-1 sudo[72792]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:29 compute-1 sudo[72792]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:30 compute-1 sshd-session[72817]: Accepted publickey for ceph-admin from 192.168.122.100 port 35196 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:30 compute-1 systemd-logind[805]: New session 27 of user ceph-admin.
Feb 02 09:38:30 compute-1 systemd[1]: Started Session 27 of User ceph-admin.
Feb 02 09:38:30 compute-1 sshd-session[72817]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:30 compute-1 sudo[72821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Feb 02 09:38:30 compute-1 sudo[72821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:30 compute-1 sudo[72821]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:30 compute-1 sshd-session[72846]: Accepted publickey for ceph-admin from 192.168.122.100 port 35204 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:30 compute-1 systemd-logind[805]: New session 28 of user ceph-admin.
Feb 02 09:38:30 compute-1 systemd[1]: Started Session 28 of User ceph-admin.
Feb 02 09:38:30 compute-1 sshd-session[72846]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:30 compute-1 sudo[72850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:38:30 compute-1 sudo[72850]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:30 compute-1 sudo[72850]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:30 compute-1 sshd-session[72875]: Accepted publickey for ceph-admin from 192.168.122.100 port 35206 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:30 compute-1 systemd-logind[805]: New session 29 of user ceph-admin.
Feb 02 09:38:30 compute-1 systemd[1]: Started Session 29 of User ceph-admin.
Feb 02 09:38:30 compute-1 sshd-session[72875]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:30 compute-1 sudo[72879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Feb 02 09:38:30 compute-1 sudo[72879]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:30 compute-1 sudo[72879]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:31 compute-1 sshd-session[72904]: Accepted publickey for ceph-admin from 192.168.122.100 port 35210 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:31 compute-1 systemd-logind[805]: New session 30 of user ceph-admin.
Feb 02 09:38:31 compute-1 systemd[1]: Started Session 30 of User ceph-admin.
Feb 02 09:38:31 compute-1 sshd-session[72904]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:32 compute-1 sshd-session[72931]: Accepted publickey for ceph-admin from 192.168.122.100 port 35226 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:32 compute-1 systemd-logind[805]: New session 31 of user ceph-admin.
Feb 02 09:38:32 compute-1 systemd[1]: Started Session 31 of User ceph-admin.
Feb 02 09:38:32 compute-1 sshd-session[72931]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:32 compute-1 sudo[72935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Feb 02 09:38:32 compute-1 sudo[72935]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:32 compute-1 sudo[72935]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:32 compute-1 sshd-session[72960]: Accepted publickey for ceph-admin from 192.168.122.100 port 35238 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:38:32 compute-1 systemd-logind[805]: New session 32 of user ceph-admin.
Feb 02 09:38:32 compute-1 systemd[1]: Started Session 32 of User ceph-admin.
Feb 02 09:38:32 compute-1 sshd-session[72960]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:38:32 compute-1 sudo[72964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Feb 02 09:38:32 compute-1 sudo[72964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:33 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:33 compute-1 sudo[72964]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:33 compute-1 sudo[73009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:38:33 compute-1 sudo[73009]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:33 compute-1 sudo[73009]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:33 compute-1 sudo[73034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Feb 02 09:38:33 compute-1 sudo[73034]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:33 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:33 compute-1 sudo[73034]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:33 compute-1 sudo[73078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:38:33 compute-1 sudo[73078]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:33 compute-1 sudo[73078]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:33 compute-1 sudo[73103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Feb 02 09:38:33 compute-1 sudo[73103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:33 compute-1 sudo[73103]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:33 compute-1 sudo[73167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:38:33 compute-1 sudo[73167]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:33 compute-1 sudo[73167]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:34 compute-1 sudo[73192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:38:34 compute-1 sudo[73192]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:34 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73229 (sysctl)
Feb 02 09:38:34 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 02 09:38:34 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 02 09:38:34 compute-1 sudo[73192]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:34 compute-1 sudo[73251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:38:34 compute-1 sudo[73251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:34 compute-1 sudo[73251]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:34 compute-1 sudo[73276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Feb 02 09:38:34 compute-1 sudo[73276]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:34 compute-1 sudo[73276]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:34 compute-1 sudo[73319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:38:34 compute-1 sudo[73319]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:34 compute-1 sudo[73319]: pam_unix(sudo:session): session closed for user root
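The entries above show the pattern the orchestrator at 192.168.122.100 uses against this host: log in as ceph-admin over SSH, locate python3, and run the staged cephadm copy under sudo with a per-call timeout (check-host, gather-facts, list-networks). A minimal sketch of that call pattern; the helper name and the use of subprocess are illustrative, not cephadm's own code:

    import subprocess

    FSID = "d241d473-9fcb-5f74-b163-f1ca4454e7f1"
    CEPHADM = ("/var/lib/ceph/%s/cephadm."
               "1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36" % FSID)

    def run_cephadm(host, *args, timeout=895):
        # ssh as ceph-admin, escalate with sudo, invoke the staged cephadm copy
        # with python3 -- the same sequence visible in the sshd/sudo entries above.
        cmd = ["ssh", "ceph-admin@" + host, "sudo", "/bin/python3", CEPHADM,
               "--timeout", str(timeout), *args]
        return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

    # e.g. run_cephadm("compute-1", "check-host", "--expect-hostname", "compute-1")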
Feb 02 09:38:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:35 compute-1 sudo[73344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1 -- inventory --format=json-pretty --filter-for-batch
Feb 02 09:38:35 compute-1 sudo[73344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:35 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat3689025887-lower\x2dmapped.mount: Deactivated successfully.
Feb 02 09:38:49 compute-1 podman[73406]: 2026-02-02 09:38:49.459555545 +0000 UTC m=+14.190186850 container create b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Feb 02 09:38:49 compute-1 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2021824710-merged.mount: Deactivated successfully.
Feb 02 09:38:49 compute-1 podman[73406]: 2026-02-02 09:38:49.443523323 +0000 UTC m=+14.174154658 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:38:49 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb 02 09:38:49 compute-1 systemd[1]: Started libpod-conmon-b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d.scope.
Feb 02 09:38:49 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:38:49 compute-1 podman[73406]: 2026-02-02 09:38:49.585478182 +0000 UTC m=+14.316109507 container init b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:38:49 compute-1 podman[73406]: 2026-02-02 09:38:49.59285669 +0000 UTC m=+14.323488035 container start b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:38:49 compute-1 sad_perlman[73469]: 167 167
Feb 02 09:38:49 compute-1 systemd[1]: libpod-b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d.scope: Deactivated successfully.
Feb 02 09:38:49 compute-1 podman[73406]: 2026-02-02 09:38:49.601786962 +0000 UTC m=+14.332418267 container attach b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:38:49 compute-1 podman[73406]: 2026-02-02 09:38:49.602556013 +0000 UTC m=+14.333187348 container died b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Feb 02 09:38:49 compute-1 systemd[1]: var-lib-containers-storage-overlay-a43cdac388ff540a4372b0bdff5ee97b6306dab70c2c010bf1ce6716bedd5b0c-merged.mount: Deactivated successfully.
Feb 02 09:38:49 compute-1 podman[73406]: 2026-02-02 09:38:49.643493596 +0000 UTC m=+14.374124941 container remove b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_perlman, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Feb 02 09:38:49 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:49 compute-1 systemd[1]: libpod-conmon-b9ec825bed60a10206f7a3092ce90086154d690df4689f0d0be70194b9982b6d.scope: Deactivated successfully.
Feb 02 09:38:49 compute-1 podman[73496]: 2026-02-02 09:38:49.816523311 +0000 UTC m=+0.090944883 container create 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:38:49 compute-1 podman[73496]: 2026-02-02 09:38:49.75331668 +0000 UTC m=+0.027738332 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:38:49 compute-1 systemd[1]: Started libpod-conmon-9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0.scope.
Feb 02 09:38:49 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:38:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84b79ac1165815354e27e9923c37a0915eb47f211ef80d5ba87414cf21b984f0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:49 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84b79ac1165815354e27e9923c37a0915eb47f211ef80d5ba87414cf21b984f0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:49 compute-1 podman[73496]: 2026-02-02 09:38:49.882979183 +0000 UTC m=+0.157400785 container init 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:38:49 compute-1 podman[73496]: 2026-02-02 09:38:49.889019823 +0000 UTC m=+0.163441405 container start 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:38:49 compute-1 podman[73496]: 2026-02-02 09:38:49.892605064 +0000 UTC m=+0.167026666 container attach 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]: [
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:     {
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         "available": false,
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         "being_replaced": false,
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         "ceph_device_lvm": false,
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         "lsm_data": {},
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         "lvs": [],
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         "path": "/dev/sr0",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         "rejected_reasons": [
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "Has a FileSystem",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "Insufficient space (<5GB)"
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         ],
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         "sys_api": {
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "actuators": null,
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "device_nodes": [
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:                 "sr0"
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             ],
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "devname": "sr0",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "human_readable_size": "482.00 KB",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "id_bus": "ata",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "model": "QEMU DVD-ROM",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "nr_requests": "2",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "parent": "/dev/sr0",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "partitions": {},
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "path": "/dev/sr0",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "removable": "1",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "rev": "2.5+",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "ro": "0",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "rotational": "1",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "sas_address": "",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "sas_device_handle": "",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "scheduler_mode": "mq-deadline",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "sectors": 0,
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "sectorsize": "2048",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "size": 493568.0,
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "support_discard": "2048",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "type": "disk",
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:             "vendor": "QEMU"
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:         }
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]:     }
Feb 02 09:38:50 compute-1 sharp_elgamal[73513]: ]
Feb 02 09:38:50 compute-1 systemd[1]: libpod-9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0.scope: Deactivated successfully.
Feb 02 09:38:50 compute-1 podman[74505]: 2026-02-02 09:38:50.669286233 +0000 UTC m=+0.036855009 container died 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:38:50 compute-1 systemd[1]: var-lib-containers-storage-overlay-84b79ac1165815354e27e9923c37a0915eb47f211ef80d5ba87414cf21b984f0-merged.mount: Deactivated successfully.
Feb 02 09:38:50 compute-1 podman[74505]: 2026-02-02 09:38:50.701568762 +0000 UTC m=+0.069137468 container remove 9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_elgamal, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:38:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:50 compute-1 systemd[1]: libpod-conmon-9d51fdbb9567150810c5a83c92eb5d56238b3040ded4e3873fc2527ba66bf5b0.scope: Deactivated successfully.
Feb 02 09:38:50 compute-1 sudo[73344]: pam_unix(sudo:session): session closed for user root
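The sharp_elgamal container above printed the ceph-volume inventory as JSON; the only device, /dev/sr0, is reported unusable ("Has a FileSystem", "Insufficient space (<5GB)"). A minimal sketch, not cephadm's own filtering logic, of parsing that output to pick candidate OSD devices:

    import json

    def usable_devices(inventory_json):
        # Keep only devices ceph-volume marks as available with no rejection reasons.
        devices = json.loads(inventory_json)
        return [d["path"] for d in devices
                if d.get("available") and not d.get("rejected_reasons")]

    # For the inventory printed above this returns [], since /dev/sr0
    # (the QEMU DVD-ROM) is rejected.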
Feb 02 09:38:50 compute-1 sudo[74520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 02 09:38:50 compute-1 sudo[74520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:50 compute-1 sudo[74520]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:50 compute-1 sudo[74545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph
Feb 02 09:38:50 compute-1 sudo[74545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:50 compute-1 sudo[74545]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:50 compute-1 sudo[74570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:38:50 compute-1 sudo[74570]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:50 compute-1 sudo[74570]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:50 compute-1 sudo[74595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:38:50 compute-1 sudo[74595]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:50 compute-1 sudo[74595]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:38:51 compute-1 sudo[74620]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74620]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:38:51 compute-1 sudo[74668]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74668]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:38:51 compute-1 sudo[74693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74693]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 02 09:38:51 compute-1 sudo[74718]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74718]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:38:51 compute-1 sudo[74743]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74743]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:38:51 compute-1 sudo[74768]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74768]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:38:51 compute-1 sudo[74793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74793]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:38:51 compute-1 sudo[74818]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74818]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:38:51 compute-1 sudo[74843]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74843]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:38:51 compute-1 sudo[74891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74891]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:38:51 compute-1 sudo[74916]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74916]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:38:51 compute-1 sudo[74941]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74941]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 02 09:38:51 compute-1 sudo[74966]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74966]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[74991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph
Feb 02 09:38:51 compute-1 sudo[74991]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[74991]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:51 compute-1 sudo[75016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:38:51 compute-1 sudo[75016]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:51 compute-1 sudo[75016]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:38:52 compute-1 sudo[75041]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75041]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:38:52 compute-1 sudo[75066]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75066]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:38:52 compute-1 sudo[75114]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75114]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:38:52 compute-1 sudo[75139]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75139]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 02 09:38:52 compute-1 sudo[75164]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75164]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:38:52 compute-1 sudo[75189]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75189]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:38:52 compute-1 sudo[75214]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75214]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:38:52 compute-1 sudo[75239]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75239]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:38:52 compute-1 sudo[75264]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75264]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:38:52 compute-1 sudo[75289]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75289]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:38:52 compute-1 sudo[75337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75337]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:38:52 compute-1 sudo[75362]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75362]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:52 compute-1 sudo[75387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:38:52 compute-1 sudo[75387]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:52 compute-1 sudo[75387]: pam_unix(sudo:session): session closed for user root
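Each config artifact above (ceph.conf and the admin keyring, placed both in /etc/ceph and in /var/lib/ceph/<fsid>/config) is installed the same way: touch a *.new file under /tmp/cephadm-<fsid>/, chown it to ceph-admin so the payload can be written over SSH, reset ownership to 0:0, set the final mode (644 for ceph.conf, 600 for the keyring), then mv it into place. A sketch of that staged-write pattern, with the function name and arguments as illustrative assumptions:

    import os
    import shutil

    def install_file(staging, final, payload, mode):
        # Write the *.new file in the staging tree, fix ownership and mode,
        # then move it onto the final path, mirroring the sudo sequence above.
        os.makedirs(os.path.dirname(staging), exist_ok=True)
        os.makedirs(os.path.dirname(final), exist_ok=True)
        with open(staging, "wb") as f:
            f.write(payload)
        os.chown(staging, 0, 0)        # chown -R 0:0 <file>.new
        os.chmod(staging, mode)        # 0o644 for ceph.conf, 0o600 for keyrings
        shutil.move(staging, final)    # mv <file>.new <final path>

    # e.g. install_file("/tmp/cephadm-<fsid>/etc/ceph/ceph.conf.new",
    #                   "/etc/ceph/ceph.conf", conf_bytes, 0o644)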
Feb 02 09:38:53 compute-1 sudo[75412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:38:53 compute-1 sudo[75412]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:53 compute-1 sudo[75412]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:53 compute-1 sudo[75437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:38:53 compute-1 sudo[75437]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:53 compute-1 podman[75500]: 2026-02-02 09:38:53.453821064 +0000 UTC m=+0.053079916 container create acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Feb 02 09:38:53 compute-1 systemd[1]: Started libpod-conmon-acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25.scope.
Feb 02 09:38:53 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:38:53 compute-1 podman[75500]: 2026-02-02 09:38:53.423039477 +0000 UTC m=+0.022298349 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:38:53 compute-1 podman[75500]: 2026-02-02 09:38:53.541437472 +0000 UTC m=+0.140696344 container init acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:38:53 compute-1 podman[75500]: 2026-02-02 09:38:53.548841281 +0000 UTC m=+0.148100143 container start acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:38:53 compute-1 busy_engelbart[75517]: 167 167
Feb 02 09:38:53 compute-1 systemd[1]: libpod-acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25.scope: Deactivated successfully.
Feb 02 09:38:53 compute-1 podman[75500]: 2026-02-02 09:38:53.559368977 +0000 UTC m=+0.158627849 container attach acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:38:53 compute-1 podman[75500]: 2026-02-02 09:38:53.559719237 +0000 UTC m=+0.158978089 container died acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:38:53 compute-1 podman[75500]: 2026-02-02 09:38:53.616138356 +0000 UTC m=+0.215397208 container remove acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_engelbart, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:38:53 compute-1 systemd[1]: libpod-conmon-acb34e1de3679e5efcde7a0eaa3334f62b890746448a9a23cd396db789a54f25.scope: Deactivated successfully.
Feb 02 09:38:53 compute-1 systemd[1]: Reloading.
Feb 02 09:38:53 compute-1 systemd-sysv-generator[75560]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:38:53 compute-1 systemd-rc-local-generator[75557]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:38:53 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:53 compute-1 systemd[1]: Reloading.
Feb 02 09:38:53 compute-1 systemd-rc-local-generator[75595]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:38:53 compute-1 systemd-sysv-generator[75598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:38:54 compute-1 systemd[1]: Reached target All Ceph clusters and services.
Feb 02 09:38:54 compute-1 systemd[1]: Reloading.
Feb 02 09:38:54 compute-1 systemd-rc-local-generator[75634]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:38:54 compute-1 systemd-sysv-generator[75637]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:38:54 compute-1 systemd[1]: Reached target Ceph cluster d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:38:54 compute-1 systemd[1]: Reloading.
Feb 02 09:38:54 compute-1 systemd-sysv-generator[75672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:38:54 compute-1 systemd-rc-local-generator[75669]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:38:54 compute-1 systemd[1]: Reloading.
Feb 02 09:38:54 compute-1 systemd-sysv-generator[75719]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:38:54 compute-1 systemd-rc-local-generator[75715]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:38:54 compute-1 systemd[1]: Created slice Slice /system/ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:38:54 compute-1 systemd[1]: Reached target System Time Set.
Feb 02 09:38:54 compute-1 systemd[1]: Reached target System Time Synchronized.
Feb 02 09:38:54 compute-1 systemd[1]: Starting Ceph crash.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:38:54 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:54 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 02 09:38:54 compute-1 podman[75771]: 2026-02-02 09:38:54.959458037 +0000 UTC m=+0.044369041 container create 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Feb 02 09:38:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a85a64eb64edfca97039050bd1f681fe6913753b1aa6e26f5063f680989aa6c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a85a64eb64edfca97039050bd1f681fe6913753b1aa6e26f5063f680989aa6c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a85a64eb64edfca97039050bd1f681fe6913753b1aa6e26f5063f680989aa6c/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:55 compute-1 podman[75771]: 2026-02-02 09:38:55.027942826 +0000 UTC m=+0.112853810 container init 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Feb 02 09:38:55 compute-1 podman[75771]: 2026-02-02 09:38:54.938644351 +0000 UTC m=+0.023555355 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:38:55 compute-1 podman[75771]: 2026-02-02 09:38:55.040589533 +0000 UTC m=+0.125500497 container start 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:38:55 compute-1 bash[75771]: 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e
Feb 02 09:38:55 compute-1 systemd[1]: Started Ceph crash.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:38:55 compute-1 sudo[75437]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: INFO:ceph-crash:pinging cluster to exercise our key
Feb 02 09:38:55 compute-1 sudo[75794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:38:55 compute-1 sudo[75794]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:55 compute-1 sudo[75794]: pam_unix(sudo:session): session closed for user root
Feb 02 09:38:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.228+0000 7f00ded2e640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Feb 02 09:38:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.229+0000 7f00ded2e640 -1 AuthRegistry(0x7f00d8069490) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Feb 02 09:38:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.229+0000 7f00ded2e640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Feb 02 09:38:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.229+0000 7f00ded2e640 -1 AuthRegistry(0x7f00ded2cff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Feb 02 09:38:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.231+0000 7f00dcaa3640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Feb 02 09:38:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: 2026-02-02T09:38:55.231+0000 7f00ded2e640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Feb 02 09:38:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: [errno 13] RADOS permission denied (error connecting to the cluster)
Feb 02 09:38:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1[75787]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
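The crash agent deployed above pings the cluster once, fails (no keyring found at the default paths inside the container, RADOS permission denied), and then settles into polling /var/lib/ceph/crash on a 600 s delay. A rough sketch of that polling behaviour, with the post callback as a stand-in for however new crash reports get submitted; this is an illustration, not the ceph-crash source:

    import os
    import time

    CRASH_PATH = "/var/lib/ceph/crash"
    DELAY = 600  # seconds, as reported by the agent above

    def watch_crash_dir(post):
        # Poll the crash directory and hand any new entry to the post callback.
        seen = set()
        while True:
            if os.path.isdir(CRASH_PATH):
                for entry in sorted(os.listdir(CRASH_PATH)):
                    if entry not in seen:
                        post(os.path.join(CRASH_PATH, entry))
                        seen.add(entry)
            time.sleep(DELAY)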
Feb 02 09:38:55 compute-1 sudo[75820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Feb 02 09:38:55 compute-1 sudo[75820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:38:55 compute-1 podman[75893]: 2026-02-02 09:38:55.609914271 +0000 UTC m=+0.052808409 container create 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:38:55 compute-1 systemd[1]: Started libpod-conmon-457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436.scope.
Feb 02 09:38:55 compute-1 podman[75893]: 2026-02-02 09:38:55.576928352 +0000 UTC m=+0.019822510 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:38:55 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:38:55 compute-1 podman[75893]: 2026-02-02 09:38:55.706319847 +0000 UTC m=+0.149213975 container init 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:38:55 compute-1 podman[75893]: 2026-02-02 09:38:55.714634111 +0000 UTC m=+0.157528259 container start 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Feb 02 09:38:55 compute-1 blissful_bardeen[75909]: 167 167
Feb 02 09:38:55 compute-1 systemd[1]: libpod-457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436.scope: Deactivated successfully.
Feb 02 09:38:55 compute-1 podman[75893]: 2026-02-02 09:38:55.721979208 +0000 UTC m=+0.164873326 container attach 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:38:55 compute-1 podman[75893]: 2026-02-02 09:38:55.722272496 +0000 UTC m=+0.165166614 container died 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Feb 02 09:38:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-8f9d9cfcf40122678e6ff3cf722977bbe9582d13cc4c48bfcc339b56dfc8d874-merged.mount: Deactivated successfully.
Feb 02 09:38:55 compute-1 podman[75893]: 2026-02-02 09:38:55.852430723 +0000 UTC m=+0.295324841 container remove 457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_bardeen, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:38:55 compute-1 systemd[1]: libpod-conmon-457707bdf699433cf47833343e287c4f7522917cc705097bd0f2eb855176b436.scope: Deactivated successfully.
Feb 02 09:38:55 compute-1 podman[75933]: 2026-02-02 09:38:55.98049393 +0000 UTC m=+0.041830809 container create 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2)
Feb 02 09:38:56 compute-1 systemd[1]: Started libpod-conmon-824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085.scope.
Feb 02 09:38:56 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:38:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:38:56 compute-1 podman[75933]: 2026-02-02 09:38:55.961136485 +0000 UTC m=+0.022473364 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:38:56 compute-1 podman[75933]: 2026-02-02 09:38:56.069890659 +0000 UTC m=+0.131227548 container init 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:38:56 compute-1 podman[75933]: 2026-02-02 09:38:56.076916797 +0000 UTC m=+0.138253646 container start 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:38:56 compute-1 podman[75933]: 2026-02-02 09:38:56.08024266 +0000 UTC m=+0.141579539 container attach 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid)
Feb 02 09:38:56 compute-1 sweet_mirzakhani[75949]: --> passed data devices: 0 physical, 1 LVM
Feb 02 09:38:56 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:38:56 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:38:56 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 273baa6d-671d-41d3-8896-5eac2274aa10
Feb 02 09:38:56 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Feb 02 09:38:56 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Feb 02 09:38:56 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 02 09:38:56 compute-1 lvm[76010]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 02 09:38:56 compute-1 lvm[76010]: VG ceph_vg0 finished
Feb 02 09:38:56 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 02 09:38:56 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Feb 02 09:38:57 compute-1 sweet_mirzakhani[75949]:  stderr: got monmap epoch 1
Feb 02 09:38:57 compute-1 sweet_mirzakhani[75949]: --> Creating keyring file for osd.0
Feb 02 09:38:57 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Feb 02 09:38:57 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Feb 02 09:38:57 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 273baa6d-671d-41d3-8896-5eac2274aa10 --setuser ceph --setgroup ceph
Feb 02 09:39:00 compute-1 sweet_mirzakhani[75949]:  stderr: 2026-02-02T09:38:57.425+0000 7f9d27a7f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) No valid bdev label found
Feb 02 09:39:00 compute-1 sweet_mirzakhani[75949]:  stderr: 2026-02-02T09:38:57.688+0000 7f9d27a7f740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Feb 02 09:39:00 compute-1 sweet_mirzakhani[75949]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Feb 02 09:39:00 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 02 09:39:00 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Feb 02 09:39:01 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:01 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:01 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 02 09:39:01 compute-1 sweet_mirzakhani[75949]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 02 09:39:01 compute-1 sweet_mirzakhani[75949]: --> ceph-volume lvm activate successful for osd ID: 0
Feb 02 09:39:01 compute-1 sweet_mirzakhani[75949]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Feb 02 09:39:01 compute-1 systemd[1]: libpod-824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085.scope: Deactivated successfully.
Feb 02 09:39:01 compute-1 systemd[1]: libpod-824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085.scope: Consumed 1.839s CPU time.
Feb 02 09:39:01 compute-1 podman[75933]: 2026-02-02 09:39:01.138893693 +0000 UTC m=+5.200230542 container died 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:39:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-cd77aff117acbeae3df9c2df6f0fe6ce0dcf55eb6b6217116f5f993c3fcd697d-merged.mount: Deactivated successfully.
Feb 02 09:39:01 compute-1 podman[75933]: 2026-02-02 09:39:01.19415015 +0000 UTC m=+5.255486999 container remove 824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_mirzakhani, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Feb 02 09:39:01 compute-1 systemd[1]: libpod-conmon-824ffc552739a7ea6cc86fa13c5dbf1f71631e41c05496fe0c2dc32967708085.scope: Deactivated successfully.
Feb 02 09:39:01 compute-1 sudo[75820]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:01 compute-1 sudo[76943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:39:01 compute-1 sudo[76943]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:01 compute-1 sudo[76943]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:01 compute-1 sudo[76968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1 -- lvm list --format json
Feb 02 09:39:01 compute-1 sudo[76968]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:01 compute-1 podman[77031]: 2026-02-02 09:39:01.775857665 +0000 UTC m=+0.037717423 container create 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:39:01 compute-1 systemd[1]: Started libpod-conmon-1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd.scope.
Feb 02 09:39:01 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:01 compute-1 podman[77031]: 2026-02-02 09:39:01.852666099 +0000 UTC m=+0.114525907 container init 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Feb 02 09:39:01 compute-1 podman[77031]: 2026-02-02 09:39:01.758365063 +0000 UTC m=+0.020224811 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:01 compute-1 podman[77031]: 2026-02-02 09:39:01.861784956 +0000 UTC m=+0.123644704 container start 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Feb 02 09:39:01 compute-1 podman[77031]: 2026-02-02 09:39:01.866456798 +0000 UTC m=+0.128316546 container attach 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Feb 02 09:39:01 compute-1 compassionate_solomon[77047]: 167 167
Feb 02 09:39:01 compute-1 systemd[1]: libpod-1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd.scope: Deactivated successfully.
Feb 02 09:39:01 compute-1 podman[77031]: 2026-02-02 09:39:01.868226038 +0000 UTC m=+0.130085786 container died 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 02 09:39:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-5dcd08a1eb775db66759cd5e261e2f3cf23a30cd2f7ce62f711a278766950711-merged.mount: Deactivated successfully.
Feb 02 09:39:01 compute-1 podman[77031]: 2026-02-02 09:39:01.902230315 +0000 UTC m=+0.164090033 container remove 1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_solomon, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Feb 02 09:39:01 compute-1 systemd[1]: libpod-conmon-1b4ec0a105066ac322aa94f749dd51646d9dc0f85fba4fb5434f6f2fbf71b4fd.scope: Deactivated successfully.
Feb 02 09:39:02 compute-1 podman[77071]: 2026-02-02 09:39:02.040057078 +0000 UTC m=+0.042996582 container create 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 02 09:39:02 compute-1 systemd[1]: Started libpod-conmon-13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6.scope.
Feb 02 09:39:02 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:02 compute-1 podman[77071]: 2026-02-02 09:39:02.022820813 +0000 UTC m=+0.025760367 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:02 compute-1 podman[77071]: 2026-02-02 09:39:02.145451647 +0000 UTC m=+0.148391181 container init 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:39:02 compute-1 podman[77071]: 2026-02-02 09:39:02.153762521 +0000 UTC m=+0.156702035 container start 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True)
Feb 02 09:39:02 compute-1 podman[77071]: 2026-02-02 09:39:02.157904008 +0000 UTC m=+0.160843582 container attach 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Feb 02 09:39:02 compute-1 suspicious_tu[77087]: {
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:     "0": [
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:         {
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "devices": [
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "/dev/loop3"
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             ],
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "lv_name": "ceph_lv0",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "lv_size": "21470642176",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=ZuZGzb-UVFf-JLk7-u6dl-HCpn-c9Dl-1zqhCS,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=d241d473-9fcb-5f74-b163-f1ca4454e7f1,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=273baa6d-671d-41d3-8896-5eac2274aa10,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "lv_uuid": "ZuZGzb-UVFf-JLk7-u6dl-HCpn-c9Dl-1zqhCS",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "name": "ceph_lv0",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "path": "/dev/ceph_vg0/ceph_lv0",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "tags": {
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.block_uuid": "ZuZGzb-UVFf-JLk7-u6dl-HCpn-c9Dl-1zqhCS",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.cephx_lockbox_secret": "",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.cluster_fsid": "d241d473-9fcb-5f74-b163-f1ca4454e7f1",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.cluster_name": "ceph",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.crush_device_class": "",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.encrypted": "0",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.osd_fsid": "273baa6d-671d-41d3-8896-5eac2274aa10",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.osd_id": "0",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.osdspec_affinity": "default_drive_group",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.type": "block",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.vdo": "0",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:                 "ceph.with_tpm": "0"
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             },
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "type": "block",
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:             "vg_name": "ceph_vg0"
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:         }
Feb 02 09:39:02 compute-1 suspicious_tu[77087]:     ]
Feb 02 09:39:02 compute-1 suspicious_tu[77087]: }
Feb 02 09:39:02 compute-1 systemd[1]: libpod-13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6.scope: Deactivated successfully.
Feb 02 09:39:02 compute-1 podman[77071]: 2026-02-02 09:39:02.455360867 +0000 UTC m=+0.458300411 container died 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 02 09:39:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-52469679324de6d5d63ad6e7b79f8a0bc3886a66579e7d35bcc6168386cef263-merged.mount: Deactivated successfully.
Feb 02 09:39:02 compute-1 podman[77071]: 2026-02-02 09:39:02.499976014 +0000 UTC m=+0.502915558 container remove 13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_tu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Feb 02 09:39:02 compute-1 systemd[1]: libpod-conmon-13d4e50c6e1c8c8c79e5d4504e462c131f9a018d12c737528a04c5f0f74516d6.scope: Deactivated successfully.
Feb 02 09:39:02 compute-1 sudo[76968]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:02 compute-1 sudo[77108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:39:02 compute-1 sudo[77108]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:02 compute-1 sudo[77108]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:02 compute-1 sudo[77133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:39:02 compute-1 sudo[77133]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:03 compute-1 podman[77199]: 2026-02-02 09:39:03.047137568 +0000 UTC m=+0.042483188 container create 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:39:03 compute-1 systemd[1]: Started libpod-conmon-8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723.scope.
Feb 02 09:39:03 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:03 compute-1 podman[77199]: 2026-02-02 09:39:03.024935843 +0000 UTC m=+0.020281523 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:03 compute-1 podman[77199]: 2026-02-02 09:39:03.129335644 +0000 UTC m=+0.124681234 container init 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Feb 02 09:39:03 compute-1 podman[77199]: 2026-02-02 09:39:03.136625699 +0000 UTC m=+0.131971319 container start 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:39:03 compute-1 podman[77199]: 2026-02-02 09:39:03.140507888 +0000 UTC m=+0.135853568 container attach 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:39:03 compute-1 jolly_lederberg[77215]: 167 167
Feb 02 09:39:03 compute-1 systemd[1]: libpod-8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723.scope: Deactivated successfully.
Feb 02 09:39:03 compute-1 podman[77199]: 2026-02-02 09:39:03.142117654 +0000 UTC m=+0.137463284 container died 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Feb 02 09:39:03 compute-1 systemd[1]: var-lib-containers-storage-overlay-f6fcf546da33ddb409fc05eee577bcd92b594c3aa6eac9e11db1c1c417e2879b-merged.mount: Deactivated successfully.
Feb 02 09:39:03 compute-1 podman[77199]: 2026-02-02 09:39:03.178815697 +0000 UTC m=+0.174161297 container remove 8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_lederberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:39:03 compute-1 systemd[1]: libpod-conmon-8a0435963612ca6b5d28cb9aa9201c53bb715f49ac7396b76b395c4f7e3a1723.scope: Deactivated successfully.
Feb 02 09:39:03 compute-1 podman[77247]: 2026-02-02 09:39:03.451502289 +0000 UTC m=+0.052204002 container create 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:39:03 compute-1 systemd[1]: Started libpod-conmon-9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d.scope.
Feb 02 09:39:03 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:03 compute-1 podman[77247]: 2026-02-02 09:39:03.430318572 +0000 UTC m=+0.031020375 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:03 compute-1 podman[77247]: 2026-02-02 09:39:03.534359713 +0000 UTC m=+0.135061526 container init 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:39:03 compute-1 podman[77247]: 2026-02-02 09:39:03.543525051 +0000 UTC m=+0.144226784 container start 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Feb 02 09:39:03 compute-1 podman[77247]: 2026-02-02 09:39:03.547466762 +0000 UTC m=+0.148168545 container attach 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:39:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test[77263]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Feb 02 09:39:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test[77263]:                             [--no-systemd] [--no-tmpfs]
Feb 02 09:39:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test[77263]: ceph-volume activate: error: unrecognized arguments: --bad-option
Feb 02 09:39:03 compute-1 systemd[1]: libpod-9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d.scope: Deactivated successfully.
Feb 02 09:39:03 compute-1 podman[77247]: 2026-02-02 09:39:03.730635172 +0000 UTC m=+0.331336905 container died 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Feb 02 09:39:03 compute-1 systemd[1]: var-lib-containers-storage-overlay-a3e6cfe46bb9e35d1dcfa3b7de3d217db740a885bee146df40a31d688f884ecd-merged.mount: Deactivated successfully.
Feb 02 09:39:03 compute-1 podman[77247]: 2026-02-02 09:39:03.77174386 +0000 UTC m=+0.372445563 container remove 9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate-test, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Feb 02 09:39:03 compute-1 systemd[1]: libpod-conmon-9c8a0b72fe20aefa8a2e78915117cb3cfa1ffa13d6aa2538bef2dafcca2dd68d.scope: Deactivated successfully.
Feb 02 09:39:03 compute-1 systemd[1]: Reloading.
Feb 02 09:39:04 compute-1 systemd-sysv-generator[77322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:39:04 compute-1 systemd-rc-local-generator[77317]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:39:04 compute-1 systemd[1]: Reloading.
Feb 02 09:39:04 compute-1 systemd-rc-local-generator[77359]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:39:04 compute-1 systemd-sysv-generator[77362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:39:04 compute-1 systemd[1]: Starting Ceph osd.0 for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:39:04 compute-1 podman[77426]: 2026-02-02 09:39:04.669777378 +0000 UTC m=+0.070494467 container create ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Feb 02 09:39:04 compute-1 podman[77426]: 2026-02-02 09:39:04.622958019 +0000 UTC m=+0.023675088 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:04 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:04 compute-1 podman[77426]: 2026-02-02 09:39:04.781873956 +0000 UTC m=+0.182591035 container init ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Feb 02 09:39:04 compute-1 podman[77426]: 2026-02-02 09:39:04.788984516 +0000 UTC m=+0.189701605 container start ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Feb 02 09:39:04 compute-1 podman[77426]: 2026-02-02 09:39:04.798985628 +0000 UTC m=+0.199702707 container attach ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Feb 02 09:39:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:39:04 compute-1 bash[77426]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:39:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:39:04 compute-1 bash[77426]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:39:05 compute-1 lvm[77522]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 02 09:39:05 compute-1 lvm[77522]: VG ceph_vg0 finished
Feb 02 09:39:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: --> Failed to activate via raw: did not find any matching OSD to activate
Feb 02 09:39:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:39:05 compute-1 bash[77426]: --> Failed to activate via raw: did not find any matching OSD to activate
Feb 02 09:39:05 compute-1 bash[77426]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:39:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:39:05 compute-1 bash[77426]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 02 09:39:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 02 09:39:05 compute-1 bash[77426]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 02 09:39:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Feb 02 09:39:05 compute-1 bash[77426]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Feb 02 09:39:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:06 compute-1 bash[77426]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:06 compute-1 bash[77426]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 02 09:39:06 compute-1 bash[77426]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 02 09:39:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 02 09:39:06 compute-1 bash[77426]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 02 09:39:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate[77441]: --> ceph-volume lvm activate successful for osd ID: 0
Feb 02 09:39:06 compute-1 bash[77426]: --> ceph-volume lvm activate successful for osd ID: 0
Feb 02 09:39:06 compute-1 systemd[1]: libpod-ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785.scope: Deactivated successfully.
Feb 02 09:39:06 compute-1 podman[77426]: 2026-02-02 09:39:06.108599609 +0000 UTC m=+1.509316668 container died ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 02 09:39:06 compute-1 systemd[1]: libpod-ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785.scope: Consumed 1.306s CPU time.
Feb 02 09:39:06 compute-1 systemd[1]: var-lib-containers-storage-overlay-1519328e7f63a1cd0039cf771c8b2054bd01fe8399ec3133544bf518fef05bac-merged.mount: Deactivated successfully.
Feb 02 09:39:06 compute-1 podman[77426]: 2026-02-02 09:39:06.144872881 +0000 UTC m=+1.545589930 container remove ae9883c51d91e739a9e0ad046c17bb0d387249130f225f42b3c43c87f9093785 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0-activate, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:39:06 compute-1 podman[77671]: 2026-02-02 09:39:06.34186864 +0000 UTC m=+0.044013840 container create 4f53c93054a9f438fb3ecc749c307ebb8ea38df5790a8803f0451fb2ff7bfad3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:39:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f8e173f1bfdda3a8998be8783246431a26077ee4afa149923003b7ab42e6be4/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:06 compute-1 podman[77671]: 2026-02-02 09:39:06.402893229 +0000 UTC m=+0.105038469 container init 4f53c93054a9f438fb3ecc749c307ebb8ea38df5790a8803f0451fb2ff7bfad3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:39:06 compute-1 podman[77671]: 2026-02-02 09:39:06.316916607 +0000 UTC m=+0.019061817 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:06 compute-1 podman[77671]: 2026-02-02 09:39:06.416179634 +0000 UTC m=+0.118324844 container start 4f53c93054a9f438fb3ecc749c307ebb8ea38df5790a8803f0451fb2ff7bfad3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Feb 02 09:39:06 compute-1 bash[77671]: 4f53c93054a9f438fb3ecc749c307ebb8ea38df5790a8803f0451fb2ff7bfad3
Feb 02 09:39:06 compute-1 systemd[1]: Started Ceph osd.0 for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:39:06 compute-1 ceph-osd[77691]: set uid:gid to 167:167 (ceph:ceph)
Feb 02 09:39:06 compute-1 ceph-osd[77691]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Feb 02 09:39:06 compute-1 ceph-osd[77691]: pidfile_write: ignore empty --pid-file
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:06 compute-1 sudo[77133]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:06 compute-1 sudo[77703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:39:06 compute-1 sudo[77703]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:06 compute-1 sudo[77703]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:06 compute-1 sudo[77728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1 -- raw list --format json
Feb 02 09:39:06 compute-1 sudo[77728]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:06 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:06 compute-1 podman[77801]: 2026-02-02 09:39:06.971889568 +0000 UTC m=+0.051718128 container create bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:39:07 compute-1 systemd[1]: Started libpod-conmon-bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918.scope.
Feb 02 09:39:07 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:07 compute-1 podman[77801]: 2026-02-02 09:39:07.041056027 +0000 UTC m=+0.120884607 container init bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Feb 02 09:39:07 compute-1 podman[77801]: 2026-02-02 09:39:06.94993469 +0000 UTC m=+0.029763300 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:07 compute-1 podman[77801]: 2026-02-02 09:39:07.052565791 +0000 UTC m=+0.132394391 container start bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Feb 02 09:39:07 compute-1 podman[77801]: 2026-02-02 09:39:07.056862132 +0000 UTC m=+0.136690722 container attach bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2)
Feb 02 09:39:07 compute-1 cool_morse[77817]: 167 167
Feb 02 09:39:07 compute-1 systemd[1]: libpod-bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918.scope: Deactivated successfully.
Feb 02 09:39:07 compute-1 podman[77801]: 2026-02-02 09:39:07.058358774 +0000 UTC m=+0.138187374 container died bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:39:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-803654d395faadb15789c9ab5627dbe13a389df7419e35874d4d864aaf9d2268-merged.mount: Deactivated successfully.
Feb 02 09:39:07 compute-1 podman[77801]: 2026-02-02 09:39:07.113374604 +0000 UTC m=+0.193203204 container remove bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_morse, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Feb 02 09:39:07 compute-1 systemd[1]: libpod-conmon-bd0f7b15a1f8a1cc660dc7bf3658ed6db41099e24c52201032af804e64227918.scope: Deactivated successfully.
Feb 02 09:39:07 compute-1 podman[77844]: 2026-02-02 09:39:07.287250602 +0000 UTC m=+0.061235226 container create 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7c00 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:07 compute-1 systemd[1]: Started libpod-conmon-185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09.scope.
Feb 02 09:39:07 compute-1 podman[77844]: 2026-02-02 09:39:07.261787355 +0000 UTC m=+0.035772039 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:07 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:07 compute-1 podman[77844]: 2026-02-02 09:39:07.380099728 +0000 UTC m=+0.154084372 container init 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Feb 02 09:39:07 compute-1 podman[77844]: 2026-02-02 09:39:07.386122147 +0000 UTC m=+0.160106731 container start 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:39:07 compute-1 podman[77844]: 2026-02-02 09:39:07.389310687 +0000 UTC m=+0.163295351 container attach 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616de9a7800 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:07 compute-1 ceph-osd[77691]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Feb 02 09:39:07 compute-1 ceph-osd[77691]: load: jerasure load: lrc 
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Feb 02 09:39:07 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:07 compute-1 lvm[77942]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 02 09:39:07 compute-1 lvm[77942]: VG ceph_vg0 finished
Feb 02 09:39:08 compute-1 funny_blackwell[77863]: {}
Feb 02 09:39:08 compute-1 systemd[1]: libpod-185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09.scope: Deactivated successfully.
Feb 02 09:39:08 compute-1 systemd[1]: libpod-185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09.scope: Consumed 1.018s CPU time.
Feb 02 09:39:08 compute-1 podman[77844]: 2026-02-02 09:39:08.08226692 +0000 UTC m=+0.856251534 container died 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default)
Feb 02 09:39:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-1a6854a4c6a97cec9e59b7ed4594f37d280e811cdd8a3df9aac88073c5581177-merged.mount: Deactivated successfully.
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:08 compute-1 podman[77844]: 2026-02-02 09:39:08.132876755 +0000 UTC m=+0.906861379 container remove 185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_blackwell, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:39:08 compute-1 systemd[1]: libpod-conmon-185f0503aa65d97ed9ddd75613a9c43ce3a0026079bb24baeee81b8b6dc3bd09.scope: Deactivated successfully.
Feb 02 09:39:08 compute-1 sudo[77728]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:08 compute-1 ceph-osd[77691]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 02 09:39:08 compute-1 ceph-osd[77691]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df854c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount shared_bdev_used = 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: RocksDB version: 7.9.2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Git sha 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Compile date 2025-07-17 03:12:14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: DB SUMMARY
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: DB Session ID:  URAREYQTFRBAH0CRPE4U
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: CURRENT file:  CURRENT
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: IDENTITY file:  IDENTITY
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                         Options.error_if_exists: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.create_if_missing: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                         Options.paranoid_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                                     Options.env: 0x5616df813dc0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                                Options.info_log: 0x5616df8177a0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_file_opening_threads: 16
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                              Options.statistics: (nil)
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.use_fsync: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.max_log_file_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                         Options.allow_fallocate: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.use_direct_reads: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.create_missing_column_families: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                              Options.db_log_dir: 
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                                 Options.wal_dir: db.wal
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.advise_random_on_open: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.write_buffer_manager: 0x5616df91ea00
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                            Options.rate_limiter: (nil)
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.unordered_write: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.row_cache: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                              Options.wal_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.allow_ingest_behind: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.two_write_queues: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.manual_wal_flush: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.wal_compression: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.atomic_flush: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.log_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.allow_data_in_errors: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.db_host_id: __hostname__
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.max_background_jobs: 4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.max_background_compactions: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.max_subcompactions: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.max_open_files: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.bytes_per_sync: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.max_background_flushes: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Compression algorithms supported:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kZSTD supported: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kXpressCompression supported: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kBZip2Compression supported: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kLZ4Compression supported: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kZlibCompression supported: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kLZ4HCCompression supported: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kSnappyCompression supported: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3c9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3c9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3c9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b15bed18-a53e-4ccb-bbb2-c9066cfaeff1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148509956, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148510262, "job": 1, "event": "recovery_finished"}
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: freelist init
Feb 02 09:39:08 compute-1 ceph-osd[77691]: freelist _read_cfg
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs umount
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) close
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bdev(0x5616df855000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluefs mount shared_bdev_used = 4718592
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: RocksDB version: 7.9.2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Git sha 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Compile date 2025-07-17 03:12:14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: DB SUMMARY
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: DB Session ID:  URAREYQTFRBAH0CRPE4V
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: CURRENT file:  CURRENT
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: IDENTITY file:  IDENTITY
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                         Options.error_if_exists: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.create_if_missing: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                         Options.paranoid_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                                     Options.env: 0x5616df9c2310
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                                Options.info_log: 0x5616df817940
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_file_opening_threads: 16
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                              Options.statistics: (nil)
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.use_fsync: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.max_log_file_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                         Options.allow_fallocate: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.use_direct_reads: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.create_missing_column_families: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                              Options.db_log_dir: 
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                                 Options.wal_dir: db.wal
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.advise_random_on_open: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.write_buffer_manager: 0x5616df91ea00
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                            Options.rate_limiter: (nil)
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.unordered_write: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.row_cache: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                              Options.wal_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.allow_ingest_behind: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.two_write_queues: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.manual_wal_flush: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.wal_compression: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.atomic_flush: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.log_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.allow_data_in_errors: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.db_host_id: __hostname__
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.max_background_jobs: 4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.max_background_compactions: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.max_subcompactions: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.max_open_files: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.bytes_per_sync: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.max_background_flushes: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Compression algorithms supported:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kZSTD supported: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kXpressCompression supported: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kBZip2Compression supported: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kLZ4Compression supported: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kZlibCompression supported: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kLZ4HCCompression supported: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         kSnappyCompression supported: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
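
[Editor's note] The table_factory options printed above (block_size 4096, cache_index_and_filter_blocks: 1, a bloom filter policy, and a shared block cache of 483183820 bytes ≈ 461 MiB split across 2^4 shards) can be approximated with stock RocksDB. Note that BinnedLRUCache is Ceph's own cache implementation, so rocksdb::NewLRUCache below is only a stand-in, and the bloom bits-per-key is an assumption since the dump does not print it. A minimal sketch, not what ceph-osd literally executes:

    // Approximate reconstruction of the table_factory options logged above.
    // Assumptions: stock NewLRUCache stands in for Ceph's BinnedLRUCache,
    // and 10 bits/key for the bloom filter (not printed in the dump).
    #include <rocksdb/cache.h>
    #include <rocksdb/filter_policy.h>
    #include <rocksdb/options.h>
    #include <rocksdb/table.h>

    rocksdb::Options MakeApproxTableOptions() {
      rocksdb::BlockBasedTableOptions t;
      t.block_size = 4096;                      // block_size: 4096
      t.block_size_deviation = 10;              // block_size_deviation: 10
      t.block_restart_interval = 16;            // block_restart_interval: 16
      t.cache_index_and_filter_blocks = true;   // cache_index_and_filter_blocks: 1
      t.pin_top_level_index_and_filter = true;  // pin_top_level_index_and_filter: 1
      t.whole_key_filtering = true;             // whole_key_filtering: 1
      t.format_version = 5;                     // format_version: 5
      t.filter_policy.reset(rocksdb::NewBloomFilterPolicy(10));  // assumed bits/key
      // capacity 483183820 B, num_shard_bits 4, as in block_cache_options
      t.block_cache = rocksdb::NewLRUCache(483183820, 4);

      rocksdb::Options o;
      o.table_factory.reset(rocksdb::NewBlockBasedTableFactory(t));
      return o;
    }
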
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
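
[Editor's note] The memtable and level settings repeated in each column-family block imply concrete sizing: write_buffer_size 16777216 (16 MiB) with min_write_buffer_number_to_merge 6 means roughly 6 × 16 MiB ≈ 96 MiB of memtable data is merged per flush, and since level_compaction_dynamic_level_bytes is 0, max_bytes_for_level_base 1073741824 (1 GiB) with multiplier 8 gives static level targets of 1 GiB (L1), 8 GiB (L2), 64 GiB (L3), and so on. A minimal sketch mirroring these fields on a stock rocksdb::Options, illustrative only and not Ceph's actual code path:

    #include <rocksdb/options.h>

    // Mirror the per-column-family memtable/compaction values printed above.
    void ApplyApproxCompactionTuning(rocksdb::Options& o) {
      o.write_buffer_size = 16 << 20;              // 16777216
      o.max_write_buffer_number = 64;
      o.min_write_buffer_number_to_merge = 6;      // ~96 MiB merged per flush
      o.level0_file_num_compaction_trigger = 8;
      o.level0_slowdown_writes_trigger = 20;
      o.level0_stop_writes_trigger = 36;
      o.target_file_size_base = 64 << 20;          // 67108864
      o.max_bytes_for_level_base = 1ull << 30;     // 1 GiB at L1
      o.max_bytes_for_level_multiplier = 8;        // L2 = 8 GiB, L3 = 64 GiB, ...
      o.compression = rocksdb::kLZ4Compression;    // Options.compression: LZ4
    }
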
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
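
[Editor's note] The log emits one options block per column family (m-2, p-0, p-1 above, with p-2 and O-0 following): these are BlueStore's sharded RocksDB column families. A RocksDB database created with multiple column families must name all of them at open time, which is why each one is initialized and logged individually here. A minimal sketch of that open sequence with the public API; the path below is a placeholder, not taken from this log:

    #include <rocksdb/db.h>
    #include <string>
    #include <vector>

    // List the column families in an existing DB, then open all of them.
    rocksdb::Status OpenAllColumnFamilies(const std::string& path) {
      rocksdb::DBOptions db_opts;
      std::vector<std::string> names;
      rocksdb::Status s = rocksdb::DB::ListColumnFamilies(db_opts, path, &names);
      if (!s.ok()) return s;

      std::vector<rocksdb::ColumnFamilyDescriptor> descs;
      for (const auto& n : names)            // e.g. "default", "m-2", "p-0", ...
        descs.emplace_back(n, rocksdb::ColumnFamilyOptions());

      std::vector<rocksdb::ColumnFamilyHandle*> handles;
      rocksdb::DB* db = nullptr;
      s = rocksdb::DB::Open(db_opts, path, descs, &handles, &db);
      if (s.ok()) {
        for (auto* h : handles) db->DestroyColumnFamilyHandle(h);
        delete db;
      }
      return s;
    }
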
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
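
[Editor's note] Every column family also registers a CompactOnDeletionCollector (sliding window 32768 entries, deletion trigger 16384, ratio 0): an SST file is marked for compaction once any window of 32768 consecutive entries contains at least 16384 tombstones, so delete-heavy OSD metadata is reclaimed promptly rather than waiting for normal level compaction. The factory is public in RocksDB; a minimal sketch wiring it in with the exact values printed above:

    #include <rocksdb/options.h>
    #include <rocksdb/utilities/table_properties_collectors.h>

    // Register the deletion-triggered compaction collector
    // (window 32768, trigger 16384, ratio 0, as in the log).
    void AddCompactOnDeletion(rocksdb::Options& o) {
      o.table_properties_collector_factories.emplace_back(
          rocksdb::NewCompactOnDeletionCollectorFactory(
              /*sliding_window_size=*/32768,
              /*deletion_trigger=*/16384,
              /*deletion_ratio=*/0.0));
    }
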
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3d350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3c9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3c9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:           Options.merge_operator: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5616df817ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x5616dea3c9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.write_buffer_size: 16777216
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.max_write_buffer_number: 64
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.compression: LZ4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: b15bed18-a53e-4ccb-bbb2-c9066cfaeff1
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148775268, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148780566, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025148, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b15bed18-a53e-4ccb-bbb2-c9066cfaeff1", "db_session_id": "URAREYQTFRBAH0CRPE4V", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148787062, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025148, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b15bed18-a53e-4ccb-bbb2-c9066cfaeff1", "db_session_id": "URAREYQTFRBAH0CRPE4V", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148789913, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025148, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "b15bed18-a53e-4ccb-bbb2-c9066cfaeff1", "db_session_id": "URAREYQTFRBAH0CRPE4V", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025148791224, "job": 1, "event": "recovery_finished"}
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5616dfa14000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: DB pointer 0x5616df9d0000
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Feb 02 09:39:08 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 09:39:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Feb 02 09:39:08 compute-1 ceph-osd[77691]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb 02 09:39:08 compute-1 ceph-osd[77691]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb 02 09:39:08 compute-1 ceph-osd[77691]: _get_class not permitted to load lua
Feb 02 09:39:08 compute-1 ceph-osd[77691]: _get_class not permitted to load sdk
Feb 02 09:39:08 compute-1 ceph-osd[77691]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb 02 09:39:08 compute-1 ceph-osd[77691]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Feb 02 09:39:08 compute-1 ceph-osd[77691]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Feb 02 09:39:08 compute-1 ceph-osd[77691]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Feb 02 09:39:08 compute-1 ceph-osd[77691]: osd.0 0 load_pgs
Feb 02 09:39:08 compute-1 ceph-osd[77691]: osd.0 0 load_pgs opened 0 pgs
Feb 02 09:39:08 compute-1 ceph-osd[77691]: osd.0 0 log_to_monitors true
Feb 02 09:39:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0[77687]: 2026-02-02T09:39:08.813+0000 7fb40eadf740 -1 osd.0 0 log_to_monitors true
Feb 02 09:39:09 compute-1 sudo[78377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:39:09 compute-1 sudo[78377]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:09 compute-1 sudo[78377]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:09 compute-1 sudo[78402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:39:09 compute-1 sudo[78402]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:09 compute-1 sudo[78402]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:09 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Feb 02 09:39:09 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Feb 02 09:39:09 compute-1 sudo[78427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Feb 02 09:39:09 compute-1 sudo[78427]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:10 compute-1 podman[78526]: 2026-02-02 09:39:10.284630501 +0000 UTC m=+0.072379870 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:39:10 compute-1 podman[78526]: 2026-02-02 09:39:10.395582536 +0000 UTC m=+0.183331895 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Feb 02 09:39:10 compute-1 sudo[78427]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:10 compute-1 ceph-osd[77691]: osd.0 0 done with init, starting boot process
Feb 02 09:39:10 compute-1 ceph-osd[77691]: osd.0 0 start_boot
Feb 02 09:39:10 compute-1 ceph-osd[77691]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb 02 09:39:10 compute-1 ceph-osd[77691]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb 02 09:39:10 compute-1 ceph-osd[77691]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb 02 09:39:10 compute-1 ceph-osd[77691]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb 02 09:39:10 compute-1 ceph-osd[77691]: osd.0 0  bench count 12288000 bsize 4 KiB
Feb 02 09:39:10 compute-1 sudo[78576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:39:10 compute-1 sudo[78576]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:10 compute-1 sudo[78576]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:10 compute-1 sudo[78601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1 -- inventory --format=json-pretty --filter-for-batch
Feb 02 09:39:10 compute-1 sudo[78601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:11 compute-1 podman[78666]: 2026-02-02 09:39:11.078422972 +0000 UTC m=+0.056109962 container create 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 02 09:39:11 compute-1 systemd[1]: Started libpod-conmon-271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e.scope.
Feb 02 09:39:11 compute-1 podman[78666]: 2026-02-02 09:39:11.0414428 +0000 UTC m=+0.019129780 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:11 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:11 compute-1 podman[78666]: 2026-02-02 09:39:11.168421897 +0000 UTC m=+0.146108887 container init 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:39:11 compute-1 podman[78666]: 2026-02-02 09:39:11.17704501 +0000 UTC m=+0.154731970 container start 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Feb 02 09:39:11 compute-1 blissful_cannon[78682]: 167 167
Feb 02 09:39:11 compute-1 systemd[1]: libpod-271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e.scope: Deactivated successfully.
Feb 02 09:39:11 compute-1 podman[78666]: 2026-02-02 09:39:11.186358363 +0000 UTC m=+0.164045323 container attach 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Feb 02 09:39:11 compute-1 podman[78666]: 2026-02-02 09:39:11.186700172 +0000 UTC m=+0.164387162 container died 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Feb 02 09:39:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-9b61ab76e6f878e3f1c921f127571566dde26f12639941dbf85293ca7a6aecdd-merged.mount: Deactivated successfully.
Feb 02 09:39:11 compute-1 podman[78666]: 2026-02-02 09:39:11.285260859 +0000 UTC m=+0.262947819 container remove 271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_cannon, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Feb 02 09:39:11 compute-1 systemd[1]: libpod-conmon-271d2edad702722aff0bb79c60089f933191502429efadb527bbf44cf5e3694e.scope: Deactivated successfully.
Feb 02 09:39:11 compute-1 podman[78705]: 2026-02-02 09:39:11.382724634 +0000 UTC m=+0.024364547 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:11 compute-1 podman[78705]: 2026-02-02 09:39:11.507930062 +0000 UTC m=+0.149569955 container create 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Feb 02 09:39:11 compute-1 systemd[1]: Started libpod-conmon-5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2.scope.
Feb 02 09:39:11 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:11 compute-1 podman[78705]: 2026-02-02 09:39:11.818591613 +0000 UTC m=+0.460231536 container init 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Feb 02 09:39:11 compute-1 podman[78705]: 2026-02-02 09:39:11.827602037 +0000 UTC m=+0.469241970 container start 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:39:12 compute-1 podman[78705]: 2026-02-02 09:39:12.023404973 +0000 UTC m=+0.665044876 container attach 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True)
Feb 02 09:39:12 compute-1 naughty_tu[78723]: [
Feb 02 09:39:12 compute-1 naughty_tu[78723]:     {
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         "available": false,
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         "being_replaced": false,
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         "ceph_device_lvm": false,
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         "lsm_data": {},
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         "lvs": [],
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         "path": "/dev/sr0",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         "rejected_reasons": [
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "Insufficient space (<5GB)",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "Has a FileSystem"
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         ],
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         "sys_api": {
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "actuators": null,
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "device_nodes": [
Feb 02 09:39:12 compute-1 naughty_tu[78723]:                 "sr0"
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             ],
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "devname": "sr0",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "human_readable_size": "482.00 KB",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "id_bus": "ata",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "model": "QEMU DVD-ROM",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "nr_requests": "2",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "parent": "/dev/sr0",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "partitions": {},
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "path": "/dev/sr0",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "removable": "1",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "rev": "2.5+",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "ro": "0",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "rotational": "1",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "sas_address": "",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "sas_device_handle": "",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "scheduler_mode": "mq-deadline",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "sectors": 0,
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "sectorsize": "2048",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "size": 493568.0,
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "support_discard": "2048",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "type": "disk",
Feb 02 09:39:12 compute-1 naughty_tu[78723]:             "vendor": "QEMU"
Feb 02 09:39:12 compute-1 naughty_tu[78723]:         }
Feb 02 09:39:12 compute-1 naughty_tu[78723]:     }
Feb 02 09:39:12 compute-1 naughty_tu[78723]: ]
Feb 02 09:39:12 compute-1 systemd[1]: libpod-5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2.scope: Deactivated successfully.
Feb 02 09:39:12 compute-1 podman[78705]: 2026-02-02 09:39:12.522470171 +0000 UTC m=+1.164110064 container died 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Feb 02 09:39:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-6d1b0af21475f5042f80d3713eb11a421a51ba53ba588a31103d8e278c92c5ca-merged.mount: Deactivated successfully.
Feb 02 09:39:12 compute-1 podman[78705]: 2026-02-02 09:39:12.610944563 +0000 UTC m=+1.252584486 container remove 5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=naughty_tu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2)
Feb 02 09:39:12 compute-1 systemd[1]: libpod-conmon-5d10fcc12ec4d1d33581e926ddd2b768c894d18a6ab9c67603458950093f7df2.scope: Deactivated successfully.
Feb 02 09:39:12 compute-1 sudo[78601]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:14 compute-1 ceph-osd[77691]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.526 iops: 4998.717 elapsed_sec: 0.600
Feb 02 09:39:14 compute-1 ceph-osd[77691]: log_channel(cluster) log [WRN] : OSD bench result of 4998.717013 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 02 09:39:14 compute-1 ceph-osd[77691]: osd.0 0 waiting for initial osdmap
Feb 02 09:39:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0[77687]: 2026-02-02T09:39:14.712+0000 7fb40aa62640 -1 osd.0 0 waiting for initial osdmap
Feb 02 09:39:14 compute-1 ceph-osd[77691]: osd.0 9 crush map has features 288514050185494528, adjusting msgr requires for clients
Feb 02 09:39:14 compute-1 ceph-osd[77691]: osd.0 9 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Feb 02 09:39:14 compute-1 ceph-osd[77691]: osd.0 9 crush map has features 3314932999778484224, adjusting msgr requires for osds
Feb 02 09:39:14 compute-1 ceph-osd[77691]: osd.0 9 check_osdmap_features require_osd_release unknown -> squid
Feb 02 09:39:14 compute-1 ceph-osd[77691]: osd.0 9 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 02 09:39:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-osd-0[77687]: 2026-02-02T09:39:14.743+0000 7fb40608a640 -1 osd.0 9 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 02 09:39:14 compute-1 ceph-osd[77691]: osd.0 9 set_numa_affinity not setting numa affinity
Feb 02 09:39:14 compute-1 ceph-osd[77691]: osd.0 9 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Feb 02 09:39:15 compute-1 ceph-osd[77691]: osd.0 9 tick checking mon for new map
Feb 02 09:39:15 compute-1 ceph-osd[77691]: osd.0 10 state: booting -> active
Feb 02 09:39:16 compute-1 ceph-osd[77691]: osd.0 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb 02 09:39:16 compute-1 ceph-osd[77691]: osd.0 11 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Feb 02 09:39:16 compute-1 ceph-osd[77691]: osd.0 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb 02 09:39:33 compute-1 sudo[79793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:39:33 compute-1 sudo[79793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:33 compute-1 sudo[79793]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:33 compute-1 sudo[79818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:39:33 compute-1 sudo[79818]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:33 compute-1 podman[79883]: 2026-02-02 09:39:33.956260586 +0000 UTC m=+0.041663464 container create d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:39:33 compute-1 systemd[1]: Started libpod-conmon-d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596.scope.
Feb 02 09:39:34 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:34 compute-1 podman[79883]: 2026-02-02 09:39:33.937396541 +0000 UTC m=+0.022799439 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:34 compute-1 podman[79883]: 2026-02-02 09:39:34.03367214 +0000 UTC m=+0.119075018 container init d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Feb 02 09:39:34 compute-1 podman[79883]: 2026-02-02 09:39:34.040041384 +0000 UTC m=+0.125444272 container start d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Feb 02 09:39:34 compute-1 podman[79883]: 2026-02-02 09:39:34.042837426 +0000 UTC m=+0.128240304 container attach d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Feb 02 09:39:34 compute-1 wonderful_keller[79899]: 167 167
Feb 02 09:39:34 compute-1 systemd[1]: libpod-d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596.scope: Deactivated successfully.
Feb 02 09:39:34 compute-1 podman[79883]: 2026-02-02 09:39:34.044946181 +0000 UTC m=+0.130349059 container died d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:39:34 compute-1 systemd[1]: var-lib-containers-storage-overlay-e47b27560c9f1de2943fda8b54a351f70eb847a54cac1705d764496bc137908e-merged.mount: Deactivated successfully.
Feb 02 09:39:34 compute-1 podman[79883]: 2026-02-02 09:39:34.083854023 +0000 UTC m=+0.169256901 container remove d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_keller, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 02 09:39:34 compute-1 systemd[1]: libpod-conmon-d460304fac7bf4e1fdca9fc6abfa20772da0554e789d82a703db095751be3596.scope: Deactivated successfully.
Feb 02 09:39:34 compute-1 podman[79917]: 2026-02-02 09:39:34.156773571 +0000 UTC m=+0.050490372 container create ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:39:34 compute-1 systemd[1]: Started libpod-conmon-ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a.scope.
Feb 02 09:39:34 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:34 compute-1 podman[79917]: 2026-02-02 09:39:34.228649462 +0000 UTC m=+0.122366283 container init ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:39:34 compute-1 podman[79917]: 2026-02-02 09:39:34.138577242 +0000 UTC m=+0.032294043 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:34 compute-1 podman[79917]: 2026-02-02 09:39:34.238005333 +0000 UTC m=+0.131722134 container start ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:39:34 compute-1 podman[79917]: 2026-02-02 09:39:34.241546314 +0000 UTC m=+0.135263085 container attach ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Feb 02 09:39:34 compute-1 systemd[1]: libpod-ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a.scope: Deactivated successfully.
Feb 02 09:39:34 compute-1 conmon[79932]: conmon ddc2e2ce4b98f06f54b4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a.scope/container/memory.events
Feb 02 09:39:34 compute-1 podman[79917]: 2026-02-02 09:39:34.33999999 +0000 UTC m=+0.233716761 container died ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Feb 02 09:39:34 compute-1 systemd[1]: var-lib-containers-storage-overlay-d419a8073852563555ed78329b04ef7e1da550e13c8f5c0b2f85e031f2767cef-merged.mount: Deactivated successfully.
Feb 02 09:39:34 compute-1 podman[79917]: 2026-02-02 09:39:34.370359242 +0000 UTC m=+0.264076063 container remove ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_benz, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:39:34 compute-1 systemd[1]: libpod-conmon-ddc2e2ce4b98f06f54b452491dc30603c6b0ac61197c638d97946e87ab92d04a.scope: Deactivated successfully.
Feb 02 09:39:34 compute-1 systemd[1]: Reloading.
Feb 02 09:39:34 compute-1 systemd-rc-local-generator[79990]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:39:34 compute-1 systemd-sysv-generator[79997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:39:34 compute-1 systemd[1]: Reloading.
Feb 02 09:39:34 compute-1 systemd-sysv-generator[80041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:39:34 compute-1 systemd-rc-local-generator[80037]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:39:34 compute-1 systemd[1]: Starting Ceph mon.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:39:35 compute-1 podman[80095]: 2026-02-02 09:39:35.085332847 +0000 UTC m=+0.053927470 container create 4fad2af3fdacb89ebd4fdf531ec7dcca4c4e2060f02fe02ce9a8fbecdbfd8229 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mon-compute-1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:39:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1276f2d60fbd2a1ce412bd995784b0a7c65ba46366e9b517ec41454855023823/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1276f2d60fbd2a1ce412bd995784b0a7c65ba46366e9b517ec41454855023823/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1276f2d60fbd2a1ce412bd995784b0a7c65ba46366e9b517ec41454855023823/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1276f2d60fbd2a1ce412bd995784b0a7c65ba46366e9b517ec41454855023823/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:35 compute-1 podman[80095]: 2026-02-02 09:39:35.148476144 +0000 UTC m=+0.117070737 container init 4fad2af3fdacb89ebd4fdf531ec7dcca4c4e2060f02fe02ce9a8fbecdbfd8229 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mon-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:39:35 compute-1 podman[80095]: 2026-02-02 09:39:35.060795115 +0000 UTC m=+0.029389788 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:35 compute-1 podman[80095]: 2026-02-02 09:39:35.155540156 +0000 UTC m=+0.124134749 container start 4fad2af3fdacb89ebd4fdf531ec7dcca4c4e2060f02fe02ce9a8fbecdbfd8229 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mon-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Feb 02 09:39:35 compute-1 bash[80095]: 4fad2af3fdacb89ebd4fdf531ec7dcca4c4e2060f02fe02ce9a8fbecdbfd8229
Feb 02 09:39:35 compute-1 systemd[1]: Started Ceph mon.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:39:35 compute-1 ceph-mon[80115]: set uid:gid to 167:167 (ceph:ceph)
Feb 02 09:39:35 compute-1 ceph-mon[80115]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pidfile_write: ignore empty --pid-file
Feb 02 09:39:35 compute-1 sudo[79818]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:35 compute-1 ceph-mon[80115]: load: jerasure load: lrc 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: RocksDB version: 7.9.2
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Git sha 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Compile date 2025-07-17 03:12:14
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: DB SUMMARY
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: DB Session ID:  DE871D21TSCUFP8UED8E
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: CURRENT file:  CURRENT
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: IDENTITY file:  IDENTITY
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                         Options.error_if_exists: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                       Options.create_if_missing: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                         Options.paranoid_checks: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                                     Options.env: 0x55a64c64ec20
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                                      Options.fs: PosixFileSystem
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                                Options.info_log: 0x55a64de99a20
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.max_file_opening_threads: 16
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                              Options.statistics: (nil)
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                               Options.use_fsync: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                       Options.max_log_file_size: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                         Options.allow_fallocate: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                        Options.use_direct_reads: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:          Options.create_missing_column_families: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                              Options.db_log_dir: 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                                 Options.wal_dir: 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                   Options.advise_random_on_open: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                    Options.write_buffer_manager: 0x55a64de9d900
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                            Options.rate_limiter: (nil)
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.unordered_write: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                               Options.row_cache: None
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                              Options.wal_filter: None
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.allow_ingest_behind: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.two_write_queues: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.manual_wal_flush: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.wal_compression: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.atomic_flush: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                 Options.log_readahead_size: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.allow_data_in_errors: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.db_host_id: __hostname__
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.max_background_jobs: 2
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.max_background_compactions: -1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.max_subcompactions: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.max_total_wal_size: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                          Options.max_open_files: -1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                          Options.bytes_per_sync: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:       Options.compaction_readahead_size: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.max_background_flushes: -1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Compression algorithms supported:
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         kZSTD supported: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         kXpressCompression supported: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         kBZip2Compression supported: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         kLZ4Compression supported: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         kZlibCompression supported: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         kLZ4HCCompression supported: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         kSnappyCompression supported: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:           Options.merge_operator: 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:        Options.compaction_filter: None
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:        Options.compaction_filter_factory: None
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:  Options.sst_partitioner_factory: None
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a64de985c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a64debd350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:        Options.write_buffer_size: 33554432
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:  Options.max_write_buffer_number: 2
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:          Options.compression: NoCompression
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:       Options.prefix_extractor: nullptr
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.num_levels: 7
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.compression_opts.level: 32767
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:               Options.compression_opts.strategy: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                  Options.compression_opts.enabled: false
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                        Options.arena_block_size: 1048576
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.disable_auto_compactions: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                   Options.inplace_update_support: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                           Options.bloom_locality: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                    Options.max_successive_merges: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.paranoid_file_checks: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.force_consistency_checks: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.report_bg_io_stats: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                               Options.ttl: 2592000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                       Options.enable_blob_files: false
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                           Options.min_blob_size: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                          Options.blob_file_size: 268435456
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb:                Options.blob_file_starting_level: 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fac1d709-8a2a-487d-b05b-57255ec289c7
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025175200272, "job": 1, "event": "recovery_started", "wal_files": [4]}
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025175202204, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025175202286, "job": 1, "event": "recovery_finished"}
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55a64debee00
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: DB pointer 0x55a64dfc8000
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 09:39:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a64debd350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Feb 02 09:39:35 compute-1 ceph-mon[80115]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Feb 02 09:39:35 compute-1 ceph-mon[80115]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(???) e0 preinit fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).mds e1 new map
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).mds e1 print_map
                                           e1
                                           btime 2026-02-02T09:37:43.907997+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 3314933000852226048, adjusting msgr requires
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).osd e13 crush map has features 288514051259236352, adjusting msgr requires
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Health check failed: Failed to apply 2 service(s): mon,mgr (CEPHADM_APPLY_SPEC_FAIL)
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v22: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/908444544' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "273baa6d-671d-41d3-8896-5eac2274aa10"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/908444544' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "273baa6d-671d-41d3-8896-5eac2274aa10"}]': finished
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e4: 1 total, 0 up, 1 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/224206128' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "fabfc705-a3af-416c-81a4-3fd4d777fb5f"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/224206128' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "fabfc705-a3af-416c-81a4-3fd4d777fb5f"}]': finished
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e5: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v25: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/380622187' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/434230814' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v26: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v27: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Deploying daemon osd.0 on compute-1
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v28: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Deploying daemon osd.1 on compute-0
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v29: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v30: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v31: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e6: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]': finished
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e7: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v34: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3299804201' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: purged_snaps scrub starts
Feb 02 09:39:35 compute-1 ceph-mon[80115]: purged_snaps scrub ok
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e8: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e9: 2 total, 0 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Adjusting osd_memory_target on compute-1 to  5247M
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v37: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Adjusting osd_memory_target on compute-0 to 127.9M
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Unable to set osd_memory_target on compute-0 to 134197657: error parsing value: Value '134197657' is below minimum 939524096
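[editor's note] The two autotune lines above show cephadm computing per-host osd_memory_target values: 5247M fits on compute-1, but the 134197657-byte target for compute-0 is rejected because it falls below the option's minimum of 939524096 bytes, so osd.1 keeps its previous setting. A minimal, illustrative way to handle a host this small is to turn off autotuning and pin an explicit value at or above that minimum; the commands below use the standard ceph config interface and the who-mask convention visible elsewhere in this log (osd/host:...), and are not taken from this log:

    ceph config set osd osd_memory_target_autotune false
    ceph config set osd/host:compute-0 osd_memory_target 939524096   # explicit value at the minimum quoted in the error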
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: purged_snaps scrub starts
Feb 02 09:39:35 compute-1 ceph-mon[80115]: purged_snaps scrub ok
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: OSD bench result of 4998.717013 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v38: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osd.0 [v2:192.168.122.101:6800/3783871040,v1:192.168.122.101:6801/3783871040] boot
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e10: 2 total, 1 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: OSD bench result of 7696.745182 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
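[editor's note] Both OSD bench results (about 4999 IOPS for osd.0 and 7697 IOPS for osd.1) fall outside the 50-500 IOPS sanity window for the hdd device class, so mclock keeps the default capacity of 315 IOPS and the message recommends measuring with an external tool and overriding the capacity. A sketch of that override, with 400 IOPS as a purely hypothetical figure standing in for a fio-measured value:

    ceph config set osd.0 osd_mclock_max_capacity_iops_hdd 400
    ceph config set osd.1 osd_mclock_max_capacity_iops_hdd 400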
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osd.1 [v2:192.168.122.100:6802/3795740271,v1:192.168.122.100:6803/3795740271] boot
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e11: 2 total, 2 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v41: 1 pgs: 1 unknown; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e12: 2 total, 2 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: osdmap e13: 2 total, 2 up, 2 in
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mgrmap e9: compute-0.djvyfo(active, since 76s)
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v44: 1 pgs: 1 creating+peering; 0 B data, 453 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v45: 1 pgs: 1 creating+peering; 0 B data, 453 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v46: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v47: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v48: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:35 compute-1 ceph-mon[80115]: pgmap v50: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Deploying daemon mon.compute-2 on compute-2
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Feb 02 09:39:35 compute-1 ceph-mon[80115]: Cluster is now healthy
Feb 02 09:39:35 compute-1 ceph-mon[80115]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Feb 02 09:39:41 compute-1 ceph-mon[80115]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Feb 02 09:39:41 compute-1 ceph-mon[80115]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Feb 02 09:39:41 compute-1 ceph-mon[80115]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Feb 02 09:39:41 compute-1 ceph-mon[80115]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Feb 02 09:39:44 compute-1 ceph-mon[80115]: Deploying daemon mon.compute-1 on compute-1
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-0 calling monitor election
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-2 calling monitor election
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: pgmap v53: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Feb 02 09:39:44 compute-1 ceph-mon[80115]: monmap epoch 2
Feb 02 09:39:44 compute-1 ceph-mon[80115]: fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:39:44 compute-1 ceph-mon[80115]: last_changed 2026-02-02T09:39:33.475774+0000
Feb 02 09:39:44 compute-1 ceph-mon[80115]: created 2026-02-02T09:37:41.899871+0000
Feb 02 09:39:44 compute-1 ceph-mon[80115]: min_mon_release 19 (squid)
Feb 02 09:39:44 compute-1 ceph-mon[80115]: election_strategy: 1
Feb 02 09:39:44 compute-1 ceph-mon[80115]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Feb 02 09:39:44 compute-1 ceph-mon[80115]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Feb 02 09:39:44 compute-1 ceph-mon[80115]: fsmap 
Feb 02 09:39:44 compute-1 ceph-mon[80115]: osdmap e13: 2 total, 2 up, 2 in
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mgrmap e9: compute-0.djvyfo(active, since 96s)
Feb 02 09:39:44 compute-1 ceph-mon[80115]: overall HEALTH_OK
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026,kernel_version=5.14.0-665.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864292,os=Linux}
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-0 calling monitor election
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-2 calling monitor election
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-1 calling monitor election
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Feb 02 09:39:44 compute-1 ceph-mon[80115]: monmap epoch 3
Feb 02 09:39:44 compute-1 ceph-mon[80115]: fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:39:44 compute-1 ceph-mon[80115]: last_changed 2026-02-02T09:39:39.266649+0000
Feb 02 09:39:44 compute-1 ceph-mon[80115]: created 2026-02-02T09:37:41.899871+0000
Feb 02 09:39:44 compute-1 ceph-mon[80115]: min_mon_release 19 (squid)
Feb 02 09:39:44 compute-1 ceph-mon[80115]: election_strategy: 1
Feb 02 09:39:44 compute-1 ceph-mon[80115]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Feb 02 09:39:44 compute-1 ceph-mon[80115]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Feb 02 09:39:44 compute-1 ceph-mon[80115]: 2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Feb 02 09:39:44 compute-1 ceph-mon[80115]: fsmap 
Feb 02 09:39:44 compute-1 ceph-mon[80115]: osdmap e13: 2 total, 2 up, 2 in
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mgrmap e9: compute-0.djvyfo(active, since 101s)
Feb 02 09:39:44 compute-1 ceph-mon[80115]: overall HEALTH_OK
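[editor's note] At this point the monmap is at epoch 3 with compute-0, compute-2 and compute-1 in quorum (ranks 0, 1, 2) and the cluster reports HEALTH_OK. For reference, the same state can be confirmed from any node holding the admin keyring with standard ceph CLI calls (not part of this log):

    ceph mon stat                      # one-line monmap summary: epoch, ranks, quorum leader
    ceph quorum_status -f json-pretty  # full quorum membership and monmap details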
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:44 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:44 compute-1 sudo[80154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:39:44 compute-1 sudo[80154]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:44 compute-1 sudo[80154]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:44 compute-1 sudo[80179]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:39:44 compute-1 sudo[80179]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:39:44 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Feb 02 09:39:44 compute-1 podman[80246]: 2026-02-02 09:39:44.990868407 +0000 UTC m=+0.047301099 container create 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:39:45 compute-1 systemd[1]: Started libpod-conmon-8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff.scope.
Feb 02 09:39:45 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:39:45 compute-1 podman[80246]: 2026-02-02 09:39:44.968507232 +0000 UTC m=+0.024939914 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:45 compute-1 podman[80246]: 2026-02-02 09:39:45.069187974 +0000 UTC m=+0.125620656 container init 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Feb 02 09:39:45 compute-1 podman[80246]: 2026-02-02 09:39:45.081375518 +0000 UTC m=+0.137808210 container start 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Feb 02 09:39:45 compute-1 podman[80246]: 2026-02-02 09:39:45.084815837 +0000 UTC m=+0.141248529 container attach 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Feb 02 09:39:45 compute-1 peaceful_buck[80262]: 167 167
Feb 02 09:39:45 compute-1 systemd[1]: libpod-8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff.scope: Deactivated successfully.
Feb 02 09:39:45 compute-1 podman[80246]: 2026-02-02 09:39:45.088570664 +0000 UTC m=+0.145003326 container died 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Feb 02 09:39:45 compute-1 systemd[1]: var-lib-containers-storage-overlay-3b4863e23ba2b1038f54ad1d06529049c7f62ba5d7b8112cb6f59a0ba41f85b9-merged.mount: Deactivated successfully.
Feb 02 09:39:45 compute-1 podman[80246]: 2026-02-02 09:39:45.119441329 +0000 UTC m=+0.175874001 container remove 8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_buck, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Feb 02 09:39:45 compute-1 systemd[1]: libpod-conmon-8faafb7ae4a14b207878b577beec2290714f38bbfd1875771e160211e18f10ff.scope: Deactivated successfully.
Feb 02 09:39:45 compute-1 systemd[1]: Reloading.
Feb 02 09:39:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e13 _set_new_cache_sizes cache_size:1019937191 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:39:45 compute-1 systemd-rc-local-generator[80302]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:39:45 compute-1 systemd-sysv-generator[80307]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:39:45 compute-1 systemd[1]: Reloading.
Feb 02 09:39:45 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:45 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.teascl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Feb 02 09:39:45 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.teascl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 02 09:39:45 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 09:39:45 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:45 compute-1 ceph-mon[80115]: Deploying daemon mgr.compute-1.teascl on compute-1
Feb 02 09:39:45 compute-1 ceph-mon[80115]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:45 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:39:45 compute-1 systemd-rc-local-generator[80345]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:39:45 compute-1 systemd-sysv-generator[80349]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:39:45 compute-1 systemd[1]: Starting Ceph mgr.compute-1.teascl for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:39:45 compute-1 podman[80403]: 2026-02-02 09:39:45.847404488 +0000 UTC m=+0.037840455 container create 0fc1762cd853cc2a965061c42d42995418c7f99fc735927111e33663986b9994 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Feb 02 09:39:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11afdbff549d454aa86b0a5e2e5a5b9932ccb9b413356ee9e1a859501bb40587/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11afdbff549d454aa86b0a5e2e5a5b9932ccb9b413356ee9e1a859501bb40587/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11afdbff549d454aa86b0a5e2e5a5b9932ccb9b413356ee9e1a859501bb40587/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11afdbff549d454aa86b0a5e2e5a5b9932ccb9b413356ee9e1a859501bb40587/merged/var/lib/ceph/mgr/ceph-compute-1.teascl supports timestamps until 2038 (0x7fffffff)
Feb 02 09:39:45 compute-1 podman[80403]: 2026-02-02 09:39:45.908884952 +0000 UTC m=+0.099320919 container init 0fc1762cd853cc2a965061c42d42995418c7f99fc735927111e33663986b9994 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:39:45 compute-1 podman[80403]: 2026-02-02 09:39:45.918389257 +0000 UTC m=+0.108825224 container start 0fc1762cd853cc2a965061c42d42995418c7f99fc735927111e33663986b9994 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Feb 02 09:39:45 compute-1 bash[80403]: 0fc1762cd853cc2a965061c42d42995418c7f99fc735927111e33663986b9994
Feb 02 09:39:45 compute-1 podman[80403]: 2026-02-02 09:39:45.830644697 +0000 UTC m=+0.021080634 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:39:45 compute-1 systemd[1]: Started Ceph mgr.compute-1.teascl for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:39:45 compute-1 ceph-mgr[80422]: set uid:gid to 167:167 (ceph:ceph)
Feb 02 09:39:45 compute-1 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb 02 09:39:45 compute-1 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb 02 09:39:45 compute-1 sudo[80179]: pam_unix(sudo:session): session closed for user root
Feb 02 09:39:45 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb 02 09:39:46 compute-1 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:39:46 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb 02 09:39:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:46.074+0000 7f2ea4a6c140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:39:46 compute-1 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:39:46 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb 02 09:39:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:46.146+0000 7f2ea4a6c140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:39:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/8543329' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Feb 02 09:39:46 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:46 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:46 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:46 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:46 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Feb 02 09:39:46 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Feb 02 09:39:46 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:46 compute-1 ceph-mon[80115]: Deploying daemon crash.compute-2 on compute-2
Feb 02 09:39:46 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb 02 09:39:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:46.801+0000 7f2ea4a6c140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:39:46 compute-1 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:39:46 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb 02 09:39:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:47.339+0000 7f2ea4a6c140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:39:47 compute-1 ceph-mon[80115]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2428528003' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb 02 09:39:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e14 e14: 2 total, 2 up, 2 in
Feb 02 09:39:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 02 09:39:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 02 09:39:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:   from numpy import show_config as show_numpy_config
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:39:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:47.484+0000 7f2ea4a6c140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb 02 09:39:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:47.545+0000 7f2ea4a6c140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb 02 09:39:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:47.663+0000 7f2ea4a6c140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:39:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2428528003' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb 02 09:39:48 compute-1 ceph-mon[80115]: osdmap e14: 2 total, 2 up, 2 in
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4145763547' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb 02 09:39:48 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e15 e15: 2 total, 2 up, 2 in
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb 02 09:39:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.544+0000 7f2ea4a6c140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:39:48 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 15 pg[3.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [0] r=0 lpr=15 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb 02 09:39:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.743+0000 7f2ea4a6c140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb 02 09:39:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.813+0000 7f2ea4a6c140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb 02 09:39:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.875+0000 7f2ea4a6c140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:39:48 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb 02 09:39:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:48.949+0000 7f2ea4a6c140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:39:49 compute-1 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:39:49 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb 02 09:39:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:49.029+0000 7f2ea4a6c140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:39:49 compute-1 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:39:49 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb 02 09:39:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:49.333+0000 7f2ea4a6c140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:39:49 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e16 e16: 3 total, 2 up, 3 in
Feb 02 09:39:49 compute-1 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:39:49 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb 02 09:39:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:49.422+0000 7f2ea4a6c140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:39:49 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 16 pg[4.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [0] r=0 lpr=16 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:39:49 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 16 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [0] r=0 lpr=15 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:39:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4145763547' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb 02 09:39:49 compute-1 ceph-mon[80115]: osdmap e15: 2 total, 2 up, 2 in
Feb 02 09:39:49 compute-1 ceph-mon[80115]: pgmap v61: 3 pgs: 1 unknown, 1 creating+peering, 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2619661592' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb 02 09:39:49 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1508149425' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d6a8c5e6-c7a4-4174-b954-0533ecfedcd2"}]: dispatch
Feb 02 09:39:49 compute-1 ceph-mon[80115]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d6a8c5e6-c7a4-4174-b954-0533ecfedcd2"}]: dispatch
Feb 02 09:39:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2619661592' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb 02 09:39:49 compute-1 ceph-mon[80115]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "d6a8c5e6-c7a4-4174-b954-0533ecfedcd2"}]': finished
Feb 02 09:39:49 compute-1 ceph-mon[80115]: osdmap e16: 3 total, 2 up, 3 in
Feb 02 09:39:49 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:39:49 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb 02 09:39:49 compute-1 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:39:49 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb 02 09:39:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:49.831+0000 7f2ea4a6c140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e16 _set_new_cache_sizes cache_size:1020053324 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb 02 09:39:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.336+0000 7f2ea4a6c140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb 02 09:39:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.414+0000 7f2ea4a6c140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e17 e17: 3 total, 2 up, 3 in
Feb 02 09:39:50 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 17 pg[5.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:39:50 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 17 pg[4.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [0] r=0 lpr=16 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:39:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.494+0000 7f2ea4a6c140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb 02 09:39:50 compute-1 ceph-mon[80115]: Health check failed: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb 02 09:39:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2640401350' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Feb 02 09:39:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1316336526' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb 02 09:39:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1316336526' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb 02 09:39:50 compute-1 ceph-mon[80115]: osdmap e17: 3 total, 2 up, 3 in
Feb 02 09:39:50 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb 02 09:39:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.625+0000 7f2ea4a6c140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb 02 09:39:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.688+0000 7f2ea4a6c140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:39:50 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb 02 09:39:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:50.827+0000 7f2ea4a6c140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:39:51 compute-1 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:39:51 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb 02 09:39:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:51.020+0000 7f2ea4a6c140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:39:51 compute-1 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:39:51 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb 02 09:39:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:51.257+0000 7f2ea4a6c140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:39:51 compute-1 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:39:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:39:51.318+0000 7f2ea4a6c140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:39:51 compute-1 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x55dc9e0b8d00 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb 02 09:39:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e18 e18: 3 total, 2 up, 3 in
Feb 02 09:39:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 18 pg[6.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:39:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 18 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:39:51 compute-1 ceph-mon[80115]: pgmap v64: 5 pgs: 3 unknown, 1 creating+peering, 1 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:51 compute-1 ceph-mon[80115]: Standby manager daemon compute-2.gzlyac started
Feb 02 09:39:51 compute-1 ceph-mon[80115]: Standby manager daemon compute-1.teascl started
Feb 02 09:39:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2386791214' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb 02 09:39:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2386791214' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb 02 09:39:51 compute-1 ceph-mon[80115]: osdmap e18: 3 total, 2 up, 3 in
Feb 02 09:39:51 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:39:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e19 e19: 3 total, 2 up, 3 in
Feb 02 09:39:52 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 19 pg[6.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:39:52 compute-1 ceph-mon[80115]: mgrmap e10: compute-0.djvyfo(active, since 109s), standbys: compute-2.gzlyac, compute-1.teascl
Feb 02 09:39:52 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-2.gzlyac", "id": "compute-2.gzlyac"}]: dispatch
Feb 02 09:39:52 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-1.teascl", "id": "compute-1.teascl"}]: dispatch
Feb 02 09:39:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4009666663' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Feb 02 09:39:52 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:52 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:39:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4009666663' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Feb 02 09:39:52 compute-1 ceph-mon[80115]: osdmap e19: 3 total, 2 up, 3 in
Feb 02 09:39:52 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:39:53 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e20 e20: 3 total, 2 up, 3 in
Feb 02 09:39:53 compute-1 ceph-mon[80115]: pgmap v67: 7 pgs: 2 creating+peering, 5 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1859598156' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Feb 02 09:39:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1859598156' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Feb 02 09:39:53 compute-1 ceph-mon[80115]: osdmap e20: 3 total, 2 up, 3 in
Feb 02 09:39:53 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:39:54 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e21 e21: 3 total, 2 up, 3 in
Feb 02 09:39:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e21 _set_new_cache_sizes cache_size:1020054713 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:39:55 compute-1 ceph-mon[80115]: osdmap e21: 3 total, 2 up, 3 in
Feb 02 09:39:55 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:39:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/740467932' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Feb 02 09:39:55 compute-1 ceph-mon[80115]: pgmap v70: 7 pgs: 2 creating+peering, 5 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:55 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Feb 02 09:39:55 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:39:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e22 e22: 3 total, 2 up, 3 in
Feb 02 09:39:56 compute-1 ceph-mon[80115]: Deploying daemon osd.2 on compute-2
Feb 02 09:39:56 compute-1 ceph-mon[80115]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb 02 09:39:56 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/740467932' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Feb 02 09:39:56 compute-1 ceph-mon[80115]: osdmap e22: 3 total, 2 up, 3 in
Feb 02 09:39:56 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:39:56 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1601645049' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Feb 02 09:39:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e23 e23: 3 total, 2 up, 3 in
Feb 02 09:39:57 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1601645049' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Feb 02 09:39:57 compute-1 ceph-mon[80115]: osdmap e23: 3 total, 2 up, 3 in
Feb 02 09:39:57 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:39:57 compute-1 ceph-mon[80115]: pgmap v73: 7 pgs: 1 creating+peering, 6 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:39:58 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e24 e24: 3 total, 2 up, 3 in
Feb 02 09:39:58 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2116931678' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Feb 02 09:39:58 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2116931678' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Feb 02 09:39:58 compute-1 ceph-mon[80115]: osdmap e24: 3 total, 2 up, 3 in
Feb 02 09:39:58 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:39:59 compute-1 ceph-mon[80115]: pgmap v75: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:40:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e24 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e25 e25: 3 total, 2 up, 3 in
Feb 02 09:40:01 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4028203447' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Feb 02 09:40:01 compute-1 ceph-mon[80115]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Feb 02 09:40:01 compute-1 ceph-mon[80115]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Feb 02 09:40:01 compute-1 ceph-mon[80115]:     application not enabled on pool 'images'
Feb 02 09:40:01 compute-1 ceph-mon[80115]:     application not enabled on pool 'cephfs.cephfs.meta'
Feb 02 09:40:01 compute-1 ceph-mon[80115]:     application not enabled on pool 'cephfs.cephfs.data'
Feb 02 09:40:01 compute-1 ceph-mon[80115]:     use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Feb 02 09:40:01 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4028203447' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Feb 02 09:40:01 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:01 compute-1 ceph-mon[80115]: osdmap e25: 3 total, 2 up, 3 in
Feb 02 09:40:01 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:01 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:01 compute-1 ceph-mon[80115]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb 02 09:40:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e26 e26: 3 total, 2 up, 3 in
Feb 02 09:40:02 compute-1 ceph-mon[80115]: pgmap v77: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:40:02 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1055840720' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Feb 02 09:40:03 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1055840720' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Feb 02 09:40:03 compute-1 ceph-mon[80115]: osdmap e26: 3 total, 2 up, 3 in
Feb 02 09:40:03 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:03 compute-1 ceph-mon[80115]: from='osd.2 [v2:192.168.122.102:6800/4043786308,v1:192.168.122.102:6801/4043786308]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Feb 02 09:40:03 compute-1 ceph-mon[80115]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Feb 02 09:40:03 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:40:03 compute-1 ceph-mon[80115]: pgmap v79: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:40:03 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e27 e27: 3 total, 2 up, 3 in
Feb 02 09:40:03 compute-1 sudo[80454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:40:03 compute-1 sudo[80454]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:03 compute-1 sudo[80454]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:04 compute-1 sudo[80479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:04 compute-1 sudo[80479]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:04 compute-1 sudo[80479]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:04 compute-1 sudo[80504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Feb 02 09:40:04 compute-1 sudo[80504]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:04 compute-1 ceph-mon[80115]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Feb 02 09:40:04 compute-1 ceph-mon[80115]: Cluster is now healthy
Feb 02 09:40:04 compute-1 ceph-mon[80115]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Feb 02 09:40:04 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:40:04 compute-1 ceph-mon[80115]: osdmap e27: 3 total, 2 up, 3 in
Feb 02 09:40:04 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:04 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:40:04 compute-1 ceph-mon[80115]: from='osd.2 [v2:192.168.122.102:6800/4043786308,v1:192.168.122.102:6801/4043786308]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Feb 02 09:40:04 compute-1 ceph-mon[80115]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Feb 02 09:40:04 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:04 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:04 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e28 e28: 3 total, 2 up, 3 in
Feb 02 09:40:04 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 28 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=28 pruub=10.911078453s) [] r=-1 lpr=28 pi=[17,28)/1 crt=0'0 mlcod 0'0 active pruub 66.645973206s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:04 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 28 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=28 pruub=8.888126373s) [] r=-1 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 active pruub 64.623085022s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:04 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 28 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=28 pruub=10.911078453s) [] r=-1 lpr=28 pi=[17,28)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.645973206s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:04 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 28 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=28 pruub=8.888126373s) [] r=-1 lpr=28 pi=[15,28)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:04 compute-1 podman[80602]: 2026-02-02 09:40:04.767236964 +0000 UTC m=+0.077869896 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Feb 02 09:40:04 compute-1 podman[80602]: 2026-02-02 09:40:04.896600326 +0000 UTC m=+0.207233278 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:40:05 compute-1 sudo[80504]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e28 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:05 compute-1 sudo[80689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:05 compute-1 sudo[80689]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:05 compute-1 sudo[80689]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:05 compute-1 sudo[80714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:40:05 compute-1 sudo[80714]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:05 compute-1 ceph-mon[80115]: purged_snaps scrub starts
Feb 02 09:40:05 compute-1 ceph-mon[80115]: purged_snaps scrub ok
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Feb 02 09:40:05 compute-1 ceph-mon[80115]: osdmap e28: 3 total, 2 up, 3 in
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:05 compute-1 ceph-mon[80115]: pgmap v82: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:05 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e29 e29: 3 total, 2 up, 3 in
Feb 02 09:40:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 29 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=29 pruub=7.829770565s) [] r=-1 lpr=29 pi=[15,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] PeeringState::start_peering_interval up [] -> [], acting [] -> [], acting_primary ? -> -1, up_primary ? -> -1, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 29 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=29 pruub=7.829770565s) [] r=-1 lpr=29 pi=[15,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:05 compute-1 sudo[80714]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:06 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:06 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:40:06 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:40:06 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:40:06 compute-1 ceph-mon[80115]: osdmap e29: 3 total, 2 up, 3 in
Feb 02 09:40:06 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:06 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:40:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2425208278' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Feb 02 09:40:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2425208278' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Feb 02 09:40:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e30 e30: 3 total, 2 up, 3 in
Feb 02 09:40:07 compute-1 sudo[80771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 02 09:40:07 compute-1 sudo[80771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[80771]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[80796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph
Feb 02 09:40:07 compute-1 sudo[80796]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[80796]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[80821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:07 compute-1 sudo[80821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[80821]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[80846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:07 compute-1 sudo[80846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[80846]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[80871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:07 compute-1 sudo[80871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[80871]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[80919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:07 compute-1 sudo[80919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[80919]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[80944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:07 compute-1 sudo[80944]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[80944]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[80969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 02 09:40:07 compute-1 sudo[80969]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[80969]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[80994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:07 compute-1 sudo[80994]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[80994]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[81019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:07 compute-1 sudo[81019]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[81019]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:40:07 compute-1 ceph-mon[80115]: osdmap e30: 3 total, 2 up, 3 in
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1005529416' entity='client.admin' 
Feb 02 09:40:07 compute-1 ceph-mon[80115]: pgmap v85: 69 pgs: 62 unknown, 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:07 compute-1 ceph-mon[80115]: OSD bench result of 6231.058141 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Feb 02 09:40:07 compute-1 ceph-mon[80115]: Adjusting osd_memory_target on compute-2 to 127.9M
Feb 02 09:40:07 compute-1 ceph-mon[80115]: Unable to set osd_memory_target on compute-2 to 134203392: error parsing value: Value '134203392' is below minimum 939524096
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:40:07 compute-1 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.conf
Feb 02 09:40:07 compute-1 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.conf
Feb 02 09:40:07 compute-1 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb 02 09:40:07 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:07 compute-1 sudo[81044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:07 compute-1 sudo[81044]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[81044]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[81069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:07 compute-1 sudo[81069]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[81069]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e31 e31: 3 total, 3 up, 3 in
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.19( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.19( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.17( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.17( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.16( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.18( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.16( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.18( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.15( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.14( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.13( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.13( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.15( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.14( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.12( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.12( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.11( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.11( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.10( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.f( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.10( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.e( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.f( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.d( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.e( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.c( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.c( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.b( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.a( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.d( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.a( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.b( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=31 pruub=5.751711845s) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.7( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=31 pruub=5.751691341s) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.623085022s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=31 pruub=7.774549484s) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.645973206s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.7( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[4.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=31 pruub=14.765381813s) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active pruub 73.636924744s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.6( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.6( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.5( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.2( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.5( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.3( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.3( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.2( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.4( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.8( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.8( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.4( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.9( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1a( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.9( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1b( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1a( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1b( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1c( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1d( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1c( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1e( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1d( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1e( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1f( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[3.1f( empty local-lis/les=15/16 n=0 ec=29/15 lis/c=15/15 les/c/f=16/16/0 sis=31) [2] r=-1 lpr=31 pi=[15,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[5.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=31 pruub=7.771995544s) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 66.645973206s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 31 pg[4.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=31 pruub=14.765381813s) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown pruub 73.636924744s@ mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:07 compute-1 sudo[81094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:07 compute-1 sudo[81094]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[81094]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[81142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:07 compute-1 sudo[81142]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[81142]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[81167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:07 compute-1 sudo[81167]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[81167]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:07 compute-1 sudo[81192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:07 compute-1 sudo[81192]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:07 compute-1 sudo[81192]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:08 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e32 e32: 3 total, 3 up, 3 in
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1e( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1f( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1e( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.10( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1f( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.11( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.13( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.11( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.10( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.12( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.15( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.12( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.13( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.14( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.14( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.15( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.17( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.16( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.16( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.17( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.8( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.9( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.8( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.9( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.b( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.a( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.a( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.b( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.d( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.c( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.d( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.6( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.7( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.c( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.3( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.2( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.6( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.4( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.5( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.7( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.5( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.4( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.2( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.e( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.f( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.3( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.f( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1c( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.e( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1d( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1d( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1c( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1a( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1b( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1a( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.1b( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.18( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.19( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[5.19( empty local-lis/les=17/18 n=0 ec=31/17 lis/c=17/17 les/c/f=18/18/0 sis=31) [2] r=-1 lpr=31 pi=[17,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.18( empty local-lis/les=16/17 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1e( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.10( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.11( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.16( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.12( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.17( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.b( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.0( empty local-lis/les=31/32 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.7( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.4( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.f( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 32 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=16/16 les/c/f=17/17/0 sis=31) [0] r=0 lpr=31 pi=[16,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:08 compute-1 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:08 compute-1 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:08 compute-1 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:40:08 compute-1 ceph-mon[80115]: osd.2 [v2:192.168.122.102:6800/4043786308,v1:192.168.122.102:6801/4043786308] boot
Feb 02 09:40:08 compute-1 ceph-mon[80115]: osdmap e31: 3 total, 3 up, 3 in
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='client.14304 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 09:40:08 compute-1 ceph-mon[80115]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:08 compute-1 ceph-mon[80115]: Saving service ingress.rgw.default spec with placement count:2
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:40:08 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:40:09 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Feb 02 09:40:09 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Feb 02 09:40:10 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Feb 02 09:40:10 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Feb 02 09:40:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e33 e33: 3 total, 3 up, 3 in
Feb 02 09:40:10 compute-1 ceph-mon[80115]: 2.1b scrub starts
Feb 02 09:40:10 compute-1 ceph-mon[80115]: 2.1b scrub ok
Feb 02 09:40:10 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:40:10 compute-1 ceph-mon[80115]: osdmap e32: 3 total, 3 up, 3 in
Feb 02 09:40:10 compute-1 ceph-mon[80115]: 2.1f scrub starts
Feb 02 09:40:10 compute-1 ceph-mon[80115]: 2.1f scrub ok
Feb 02 09:40:10 compute-1 ceph-mon[80115]: pgmap v88: 131 pgs: 33 peering, 32 activating, 62 unknown, 4 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:10 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:10 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:11 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Feb 02 09:40:11 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Feb 02 09:40:11 compute-1 ceph-mon[80115]: 4.1e scrub starts
Feb 02 09:40:11 compute-1 ceph-mon[80115]: 4.1e scrub ok
Feb 02 09:40:11 compute-1 ceph-mon[80115]: 3.f scrub starts
Feb 02 09:40:11 compute-1 ceph-mon[80115]: 3.f scrub ok
Feb 02 09:40:11 compute-1 ceph-mon[80115]: from='client.14310 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 09:40:11 compute-1 ceph-mon[80115]: Saving service node-exporter spec with placement *
Feb 02 09:40:11 compute-1 ceph-mon[80115]: 2.a scrub starts
Feb 02 09:40:11 compute-1 ceph-mon[80115]: 2.a scrub ok
Feb 02 09:40:11 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:11 compute-1 ceph-mon[80115]: 4.10 scrub starts
Feb 02 09:40:11 compute-1 ceph-mon[80115]: 4.10 scrub ok
Feb 02 09:40:11 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:40:11 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:40:11 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:11 compute-1 ceph-mon[80115]: osdmap e33: 3 total, 3 up, 3 in
Feb 02 09:40:11 compute-1 ceph-mon[80115]: Saving service grafana spec with placement compute-0;count:1
Feb 02 09:40:11 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:11 compute-1 ceph-mon[80115]: Saving service prometheus spec with placement compute-0;count:1
Feb 02 09:40:11 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:11 compute-1 ceph-mon[80115]: Saving service alertmanager spec with placement compute-0;count:1
Feb 02 09:40:11 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e34 e34: 3 total, 3 up, 3 in
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 33 pg[6.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=33 pruub=13.375972748s) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active pruub 75.653724670s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=33 pruub=13.375972748s) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown pruub 75.653724670s@ mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.19( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.e( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.17( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.12( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1b( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1c( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1f( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.c( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.9( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.8( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.b( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.6( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.f( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.18( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.a( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1d( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.2( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.d( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1a( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.4( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.7( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.11( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.3( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.1e( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.16( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.13( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.14( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.5( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.15( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 34 pg[6.10( empty local-lis/les=18/19 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:12 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Feb 02 09:40:12 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Feb 02 09:40:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e35 e35: 3 total, 3 up, 3 in
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1b( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-mon[80115]: 3.1e scrub starts
Feb 02 09:40:12 compute-1 ceph-mon[80115]: 3.1e scrub ok
Feb 02 09:40:12 compute-1 ceph-mon[80115]: pgmap v90: 193 pgs: 33 peering, 32 activating, 124 unknown, 4 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:12 compute-1 ceph-mon[80115]: 2.9 scrub starts
Feb 02 09:40:12 compute-1 ceph-mon[80115]: 2.9 scrub ok
Feb 02 09:40:12 compute-1 ceph-mon[80115]: 4.1f scrub starts
Feb 02 09:40:12 compute-1 ceph-mon[80115]: 4.1f scrub ok
Feb 02 09:40:12 compute-1 ceph-mon[80115]: osdmap e34: 3 total, 3 up, 3 in
Feb 02 09:40:12 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2304504428' entity='client.admin' 
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.18( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.c( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.19( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1f( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.6( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.7( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.4( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.d( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.0( empty local-lis/les=33/35 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.5( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.f( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.3( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.2( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.b( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.15( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.8( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.16( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.17( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.11( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.14( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.10( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.13( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.12( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1d( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.1c( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 35 pg[6.9( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=18/18 les/c/f=19/19/0 sis=33) [0] r=0 lpr=33 pi=[18,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:13 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Feb 02 09:40:13 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Feb 02 09:40:13 compute-1 ceph-mon[80115]: 3.1f scrub starts
Feb 02 09:40:13 compute-1 ceph-mon[80115]: 3.1f scrub ok
Feb 02 09:40:13 compute-1 ceph-mon[80115]: 2.7 scrub starts
Feb 02 09:40:13 compute-1 ceph-mon[80115]: 2.7 scrub ok
Feb 02 09:40:13 compute-1 ceph-mon[80115]: 4.13 scrub starts
Feb 02 09:40:13 compute-1 ceph-mon[80115]: 4.13 scrub ok
Feb 02 09:40:13 compute-1 ceph-mon[80115]: osdmap e35: 3 total, 3 up, 3 in
Feb 02 09:40:13 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3997762270' entity='client.admin' 
Feb 02 09:40:13 compute-1 ceph-mon[80115]: pgmap v93: 193 pgs: 65 peering, 31 unknown, 97 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:13 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:13 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:13 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.zjyufj", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb 02 09:40:13 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.zjyufj", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb 02 09:40:13 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:13 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:40:13 compute-1 ceph-mon[80115]: Deploying daemon rgw.rgw.compute-2.zjyufj on compute-2
Feb 02 09:40:14 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Feb 02 09:40:14 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Feb 02 09:40:14 compute-1 ceph-mon[80115]: 3.10 deep-scrub starts
Feb 02 09:40:14 compute-1 ceph-mon[80115]: 3.10 deep-scrub ok
Feb 02 09:40:14 compute-1 ceph-mon[80115]: 2.6 deep-scrub starts
Feb 02 09:40:14 compute-1 ceph-mon[80115]: 2.6 deep-scrub ok
Feb 02 09:40:14 compute-1 ceph-mon[80115]: 4.16 scrub starts
Feb 02 09:40:14 compute-1 ceph-mon[80115]: 4.16 scrub ok
Feb 02 09:40:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3629224831' entity='client.admin' 
Feb 02 09:40:14 compute-1 sudo[81217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:14 compute-1 sudo[81217]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:14 compute-1 sudo[81217]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:14 compute-1 sudo[81242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:14 compute-1 sudo[81242]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:15 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Feb 02 09:40:15 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Feb 02 09:40:15 compute-1 ceph-mon[80115]: 3.11 scrub starts
Feb 02 09:40:15 compute-1 ceph-mon[80115]: 3.11 scrub ok
Feb 02 09:40:15 compute-1 ceph-mon[80115]: 2.8 scrub starts
Feb 02 09:40:15 compute-1 ceph-mon[80115]: 2.8 scrub ok
Feb 02 09:40:15 compute-1 ceph-mon[80115]: 4.8 scrub starts
Feb 02 09:40:15 compute-1 ceph-mon[80115]: 4.8 scrub ok
Feb 02 09:40:15 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:15 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:15 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:15 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.ezjvcf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb 02 09:40:15 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.ezjvcf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb 02 09:40:15 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:15 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:40:15 compute-1 ceph-mon[80115]: Deploying daemon rgw.rgw.compute-1.ezjvcf on compute-1
Feb 02 09:40:15 compute-1 ceph-mon[80115]: pgmap v94: 193 pgs: 32 peering, 161 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:15 compute-1 podman[81309]: 2026-02-02 09:40:15.222783942 +0000 UTC m=+0.038046251 container create 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Feb 02 09:40:15 compute-1 systemd[1]: Started libpod-conmon-6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca.scope.
Feb 02 09:40:15 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:40:15 compute-1 podman[81309]: 2026-02-02 09:40:15.295970467 +0000 UTC m=+0.111232846 container init 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:40:15 compute-1 podman[81309]: 2026-02-02 09:40:15.203324361 +0000 UTC m=+0.018586690 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:40:15 compute-1 podman[81309]: 2026-02-02 09:40:15.303783648 +0000 UTC m=+0.119045977 container start 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:40:15 compute-1 nostalgic_driscoll[81325]: 167 167
Feb 02 09:40:15 compute-1 systemd[1]: libpod-6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca.scope: Deactivated successfully.
Feb 02 09:40:15 compute-1 conmon[81325]: conmon 6a4f746bf2197116f1a1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca.scope/container/memory.events
Feb 02 09:40:15 compute-1 podman[81309]: 2026-02-02 09:40:15.313337744 +0000 UTC m=+0.128600083 container attach 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True)
Feb 02 09:40:15 compute-1 podman[81309]: 2026-02-02 09:40:15.313925379 +0000 UTC m=+0.129187718 container died 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Feb 02 09:40:15 compute-1 systemd[1]: var-lib-containers-storage-overlay-0937e87ad384aee4e628dbd5763f25cc2a03f2655138eca49aad3566aef90868-merged.mount: Deactivated successfully.
Feb 02 09:40:15 compute-1 podman[81309]: 2026-02-02 09:40:15.36092028 +0000 UTC m=+0.176182619 container remove 6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_driscoll, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:40:15 compute-1 systemd[1]: libpod-conmon-6a4f746bf2197116f1a1bba494055e4ea738460928c31bed16f9b09928bb3dca.scope: Deactivated successfully.
Feb 02 09:40:15 compute-1 systemd[1]: Reloading.
Feb 02 09:40:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:15 compute-1 systemd-rc-local-generator[81362]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:40:15 compute-1 systemd-sysv-generator[81366]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:40:15 compute-1 sudo[81400]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdqrdilfsnytpmfzmzogftoumibmxnlp ; /usr/bin/python3'
Feb 02 09:40:15 compute-1 sudo[81400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:40:15 compute-1 systemd[1]: Reloading.
Feb 02 09:40:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e36 e36: 3 total, 3 up, 3 in
Feb 02 09:40:15 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 36 pg[8.0( empty local-lis/les=0/0 n=0 ec=36/36 lis/c=0/0 les/c/f=0/0/0 sis=36) [0] r=0 lpr=36 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:15 compute-1 systemd-sysv-generator[81437]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:40:15 compute-1 systemd-rc-local-generator[81433]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:40:15 compute-1 python3[81404]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:40:15 compute-1 sudo[81400]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:15 compute-1 systemd[1]: Starting Ceph rgw.rgw.compute-1.ezjvcf for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:40:15 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Feb 02 09:40:15 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Feb 02 09:40:16 compute-1 podman[81508]: 2026-02-02 09:40:16.200448303 +0000 UTC m=+0.052555215 container create e606a473626c5a5a09847083ea79e9733c181fd2942d240723a9ade446a340dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-rgw-rgw-compute-1-ezjvcf, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:40:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9e030790992795d171cc14f037f50136b47708a0fde4cb6d08d5e5623e1a535/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:40:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9e030790992795d171cc14f037f50136b47708a0fde4cb6d08d5e5623e1a535/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:40:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9e030790992795d171cc14f037f50136b47708a0fde4cb6d08d5e5623e1a535/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:40:16 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9e030790992795d171cc14f037f50136b47708a0fde4cb6d08d5e5623e1a535/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.ezjvcf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:40:16 compute-1 podman[81508]: 2026-02-02 09:40:16.179870453 +0000 UTC m=+0.031977355 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:40:16 compute-1 podman[81508]: 2026-02-02 09:40:16.280105495 +0000 UTC m=+0.132212447 container init e606a473626c5a5a09847083ea79e9733c181fd2942d240723a9ade446a340dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-rgw-rgw-compute-1-ezjvcf, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:40:16 compute-1 podman[81508]: 2026-02-02 09:40:16.286855759 +0000 UTC m=+0.138962671 container start e606a473626c5a5a09847083ea79e9733c181fd2942d240723a9ade446a340dd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-rgw-rgw-compute-1-ezjvcf, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:40:16 compute-1 bash[81508]: e606a473626c5a5a09847083ea79e9733c181fd2942d240723a9ade446a340dd
Feb 02 09:40:16 compute-1 systemd[1]: Started Ceph rgw.rgw.compute-1.ezjvcf for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:40:16 compute-1 sudo[81242]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:16 compute-1 ceph-mon[80115]: 3.12 scrub starts
Feb 02 09:40:16 compute-1 ceph-mon[80115]: 3.12 scrub ok
Feb 02 09:40:16 compute-1 ceph-mon[80115]: 2.1e deep-scrub starts
Feb 02 09:40:16 compute-1 ceph-mon[80115]: 2.1e deep-scrub ok
Feb 02 09:40:16 compute-1 ceph-mon[80115]: 4.12 scrub starts
Feb 02 09:40:16 compute-1 ceph-mon[80115]: 4.12 scrub ok
Feb 02 09:40:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3477511090' entity='client.admin' 
Feb 02 09:40:16 compute-1 ceph-mon[80115]: osdmap e36: 3 total, 3 up, 3 in
Feb 02 09:40:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/343742408' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Feb 02 09:40:16 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Feb 02 09:40:16 compute-1 radosgw[81528]: deferred set uid:gid to 167:167 (ceph:ceph)
Feb 02 09:40:16 compute-1 radosgw[81528]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Feb 02 09:40:16 compute-1 radosgw[81528]: framework: beast
Feb 02 09:40:16 compute-1 radosgw[81528]: framework conf key: endpoint, val: 192.168.122.101:8082
Feb 02 09:40:16 compute-1 radosgw[81528]: init_numa not setting numa affinity
Feb 02 09:40:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e37 e37: 3 total, 3 up, 3 in
Feb 02 09:40:16 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 37 pg[8.0( empty local-lis/les=36/37 n=0 ec=36/36 lis/c=0/0 les/c/f=0/0/0 sis=36) [0] r=0 lpr=36 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:17 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Feb 02 09:40:17 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Feb 02 09:40:17 compute-1 ceph-mon[80115]: 3.1c scrub starts
Feb 02 09:40:17 compute-1 ceph-mon[80115]: 3.1c scrub ok
Feb 02 09:40:17 compute-1 ceph-mon[80115]: 2.1d scrub starts
Feb 02 09:40:17 compute-1 ceph-mon[80115]: 2.1d scrub ok
Feb 02 09:40:17 compute-1 ceph-mon[80115]: 4.14 scrub starts
Feb 02 09:40:17 compute-1 ceph-mon[80115]: 4.14 scrub ok
Feb 02 09:40:17 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:17 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:17 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:17 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vltabo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb 02 09:40:17 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vltabo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb 02 09:40:17 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:17 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:40:17 compute-1 ceph-mon[80115]: Deploying daemon rgw.rgw.compute-0.vltabo on compute-0
Feb 02 09:40:17 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Feb 02 09:40:17 compute-1 ceph-mon[80115]: osdmap e37: 3 total, 3 up, 3 in
Feb 02 09:40:17 compute-1 ceph-mon[80115]: pgmap v97: 194 pgs: 1 creating+peering, 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:17 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1069846288' entity='client.admin' 
Feb 02 09:40:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e38 e38: 3 total, 3 up, 3 in
Feb 02 09:40:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Feb 02 09:40:17 compute-1 ceph-mon[80115]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb 02 09:40:18 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.17 deep-scrub starts
Feb 02 09:40:18 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 38 pg[9.0( empty local-lis/les=0/0 n=0 ec=38/38 lis/c=0/0 les/c/f=0/0/0 sis=38) [0] r=0 lpr=38 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:18 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.17 deep-scrub ok
Feb 02 09:40:18 compute-1 ceph-mon[80115]: 3.15 scrub starts
Feb 02 09:40:18 compute-1 ceph-mon[80115]: 3.15 scrub ok
Feb 02 09:40:18 compute-1 ceph-mon[80115]: 2.4 scrub starts
Feb 02 09:40:18 compute-1 ceph-mon[80115]: 2.4 scrub ok
Feb 02 09:40:18 compute-1 ceph-mon[80115]: 4.11 scrub starts
Feb 02 09:40:18 compute-1 ceph-mon[80115]: 4.11 scrub ok
Feb 02 09:40:18 compute-1 ceph-mon[80115]: osdmap e38: 3 total, 3 up, 3 in
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1995934692' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:18 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:18 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e39 e39: 3 total, 3 up, 3 in
Feb 02 09:40:19 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Feb 02 09:40:19 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 39 pg[9.0( empty local-lis/les=38/39 n=0 ec=38/38 lis/c=0/0 les/c/f=0/0/0 sis=38) [0] r=0 lpr=38 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:19 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Feb 02 09:40:19 compute-1 ceph-mon[80115]: 3.c scrub starts
Feb 02 09:40:19 compute-1 ceph-mon[80115]: 3.c scrub ok
Feb 02 09:40:19 compute-1 ceph-mon[80115]: 2.1 scrub starts
Feb 02 09:40:19 compute-1 ceph-mon[80115]: 2.1 scrub ok
Feb 02 09:40:19 compute-1 ceph-mon[80115]: 4.17 deep-scrub starts
Feb 02 09:40:19 compute-1 ceph-mon[80115]: 4.17 deep-scrub ok
Feb 02 09:40:19 compute-1 ceph-mon[80115]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Feb 02 09:40:19 compute-1 ceph-mon[80115]: Deploying daemon haproxy.rgw.default.compute-0.avekxu on compute-0
Feb 02 09:40:19 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3639574610' entity='client.admin' 
Feb 02 09:40:19 compute-1 ceph-mon[80115]: pgmap v99: 195 pgs: 1 unknown, 1 creating+peering, 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:19 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Feb 02 09:40:19 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Feb 02 09:40:19 compute-1 ceph-mon[80115]: osdmap e39: 3 total, 3 up, 3 in
Feb 02 09:40:19 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Feb 02 09:40:19 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Feb 02 09:40:19 compute-1 ceph-mon[80115]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb 02 09:40:19 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.b scrub starts
Feb 02 09:40:19 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.b scrub ok
Feb 02 09:40:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:20 compute-1 ceph-mon[80115]: 3.14 scrub starts
Feb 02 09:40:20 compute-1 ceph-mon[80115]: 3.14 scrub ok
Feb 02 09:40:20 compute-1 ceph-mon[80115]: 2.5 scrub starts
Feb 02 09:40:20 compute-1 ceph-mon[80115]: 2.5 scrub ok
Feb 02 09:40:20 compute-1 ceph-mon[80115]: 4.9 scrub starts
Feb 02 09:40:20 compute-1 ceph-mon[80115]: 4.9 scrub ok
Feb 02 09:40:20 compute-1 ceph-mon[80115]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Feb 02 09:40:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2604604119' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Feb 02 09:40:20 compute-1 ceph-mon[80115]: osdmap e40: 3 total, 3 up, 3 in
Feb 02 09:40:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb 02 09:40:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1995934692' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb 02 09:40:20 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb 02 09:40:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb 02 09:40:20 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Feb 02 09:40:20 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Feb 02 09:40:20 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Feb 02 09:40:20 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Feb 02 09:40:21 compute-1 ceph-mon[80115]: 3.17 scrub starts
Feb 02 09:40:21 compute-1 ceph-mon[80115]: 3.17 scrub ok
Feb 02 09:40:21 compute-1 ceph-mon[80115]: 2.1c scrub starts
Feb 02 09:40:21 compute-1 ceph-mon[80115]: 2.1c scrub ok
Feb 02 09:40:21 compute-1 ceph-mon[80115]: 4.b scrub starts
Feb 02 09:40:21 compute-1 ceph-mon[80115]: 4.b scrub ok
Feb 02 09:40:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2604604119' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Feb 02 09:40:21 compute-1 ceph-mon[80115]: 3.18 scrub starts
Feb 02 09:40:21 compute-1 ceph-mon[80115]: 3.18 scrub ok
Feb 02 09:40:21 compute-1 ceph-mon[80115]: mgrmap e11: compute-0.djvyfo(active, since 2m), standbys: compute-2.gzlyac, compute-1.teascl
Feb 02 09:40:21 compute-1 ceph-mon[80115]: pgmap v102: 196 pgs: 2 unknown, 1 creating+peering, 193 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Feb 02 09:40:21 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Feb 02 09:40:21 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Feb 02 09:40:21 compute-1 ceph-mon[80115]: osdmap e41: 3 total, 3 up, 3 in
Feb 02 09:40:21 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Feb 02 09:40:21 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Feb 02 09:40:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Feb 02 09:40:22 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 42 pg[11.0( empty local-lis/les=0/0 n=0 ec=42/42 lis/c=0/0 les/c/f=0/0/0 sis=42) [0] r=0 lpr=42 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Feb 02 09:40:22 compute-1 ceph-mon[80115]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb 02 09:40:22 compute-1 ceph-mon[80115]: 2.0 scrub starts
Feb 02 09:40:22 compute-1 ceph-mon[80115]: 2.0 scrub ok
Feb 02 09:40:22 compute-1 ceph-mon[80115]: 4.7 scrub starts
Feb 02 09:40:22 compute-1 ceph-mon[80115]: 4.7 scrub ok
Feb 02 09:40:22 compute-1 ceph-mon[80115]: 3.d scrub starts
Feb 02 09:40:22 compute-1 ceph-mon[80115]: 3.d scrub ok
Feb 02 09:40:22 compute-1 ceph-mon[80115]: osdmap e42: 3 total, 3 up, 3 in
Feb 02 09:40:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb 02 09:40:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1995934692' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb 02 09:40:22 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb 02 09:40:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1965440456' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Feb 02 09:40:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb 02 09:40:22 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Feb 02 09:40:22 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:22 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:22 compute-1 ceph-mon[80115]: from='mgr.14122 192.168.122.100:0/4293432189' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:22 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.15 deep-scrub starts
Feb 02 09:40:22 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.15 deep-scrub ok
Feb 02 09:40:23 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Feb 02 09:40:23 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Feb 02 09:40:23 compute-1 ceph-mon[80115]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb 02 09:40:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 43 pg[11.0( empty local-lis/les=42/43 n=0 ec=42/42 lis/c=0/0 les/c/f=0/0/0 sis=42) [0] r=0 lpr=42 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr handle_mgr_map respawning because set of enabled modules changed!
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  e: '/usr/bin/ceph-mgr'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  0: '/usr/bin/ceph-mgr'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  1: '-n'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  2: 'mgr.compute-1.teascl'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  3: '-f'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  4: '--setuser'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  5: 'ceph'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  6: '--setgroup'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  7: 'ceph'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  8: '--default-log-to-file=false'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  9: '--default-log-to-journald=true'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr respawn  10: '--default-log-to-stderr=false'
Feb 02 09:40:23 compute-1 sshd-session[72963]: Connection closed by 192.168.122.100 port 35238
Feb 02 09:40:23 compute-1 sshd-session[72733]: Connection closed by 192.168.122.100 port 35164
Feb 02 09:40:23 compute-1 sshd-session[72849]: Connection closed by 192.168.122.100 port 35204
Feb 02 09:40:23 compute-1 sshd-session[72762]: Connection closed by 192.168.122.100 port 35172
Feb 02 09:40:23 compute-1 sshd-session[72934]: Connection closed by 192.168.122.100 port 35226
Feb 02 09:40:23 compute-1 sshd-session[72820]: Connection closed by 192.168.122.100 port 35196
Feb 02 09:40:23 compute-1 sshd-session[72704]: Connection closed by 192.168.122.100 port 35150
Feb 02 09:40:23 compute-1 sshd-session[72907]: Connection closed by 192.168.122.100 port 35210
Feb 02 09:40:23 compute-1 sshd-session[72791]: Connection closed by 192.168.122.100 port 35186
Feb 02 09:40:23 compute-1 sshd-session[72878]: Connection closed by 192.168.122.100 port 35206
Feb 02 09:40:23 compute-1 sshd-session[72675]: Connection closed by 192.168.122.100 port 35142
Feb 02 09:40:23 compute-1 sshd-session[72674]: Connection closed by 192.168.122.100 port 35128
Feb 02 09:40:23 compute-1 sshd-session[72846]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 sshd-session[72931]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 sshd-session[72651]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 sshd-session[72904]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 sshd-session[72960]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 sshd-session[72817]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 sshd-session[72701]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 systemd[1]: session-32.scope: Consumed 55.279s CPU time.
Feb 02 09:40:23 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 28 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 31 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 20 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 32 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 23 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 27 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 30 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 sshd-session[72759]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 sshd-session[72730]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 sshd-session[72875]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 systemd[1]: session-29.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 sshd-session[72788]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 sshd-session[72669]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:23 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 25 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 29 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd[1]: session-22.scope: Deactivated successfully.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 26 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 24 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Session 22 logged out. Waiting for processes to exit.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 28.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 31.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 30.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 32.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 20.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 27.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 23.
Feb 02 09:40:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setuser ceph since I am not root
Feb 02 09:40:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setgroup ceph since I am not root
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 25.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 29.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 24.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 26.
Feb 02 09:40:23 compute-1 systemd-logind[805]: Removed session 22.
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb 02 09:40:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:23.293+0000 7f6a2fa24140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:40:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:23.366+0000 7f6a2fa24140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:40:23 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb 02 09:40:23 compute-1 ceph-mon[80115]: 2.d scrub starts
Feb 02 09:40:23 compute-1 ceph-mon[80115]: 2.d scrub ok
Feb 02 09:40:23 compute-1 ceph-mon[80115]: 4.1 scrub starts
Feb 02 09:40:23 compute-1 ceph-mon[80115]: 4.1 scrub ok
Feb 02 09:40:23 compute-1 ceph-mon[80115]: 3.13 scrub starts
Feb 02 09:40:23 compute-1 ceph-mon[80115]: 3.13 scrub ok
Feb 02 09:40:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Feb 02 09:40:23 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Feb 02 09:40:23 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Feb 02 09:40:23 compute-1 ceph-mon[80115]: osdmap e43: 3 total, 3 up, 3 in
Feb 02 09:40:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1861488831' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb 02 09:40:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1995934692' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb 02 09:40:23 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb 02 09:40:23 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb 02 09:40:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Feb 02 09:40:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1965440456' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Feb 02 09:40:23 compute-1 ceph-mon[80115]: mgrmap e12: compute-0.djvyfo(active, since 2m), standbys: compute-2.gzlyac, compute-1.teascl
Feb 02 09:40:23 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Feb 02 09:40:23 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb 02 09:40:24 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:40:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:24.091+0000 7f6a2fa24140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb 02 09:40:24 compute-1 radosgw[81528]: v1 topic migration: starting v1 topic migration..
Feb 02 09:40:24 compute-1 radosgw[81528]: LDAP not started since no server URIs were provided in the configuration.
Feb 02 09:40:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-rgw-rgw-compute-1-ezjvcf[81524]: 2026-02-02T09:40:24.539+0000 7fc5d4838980 -1 LDAP not started since no server URIs were provided in the configuration.
Feb 02 09:40:24 compute-1 radosgw[81528]: v1 topic migration: finished v1 topic migration
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 radosgw[81528]: framework: beast
Feb 02 09:40:24 compute-1 radosgw[81528]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Feb 02 09:40:24 compute-1 radosgw[81528]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Feb 02 09:40:24 compute-1 radosgw[81528]: starting handler: beast
Feb 02 09:40:24 compute-1 radosgw[81528]: set uid:gid to 167:167 (ceph:ceph)
Feb 02 09:40:24 compute-1 radosgw[81528]: mgrc service_daemon_register rgw.24170 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.ezjvcf,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026,kernel_version=5.14.0-665.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864292,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=d5604b0e-c827-4596-94de-7709c44354e7,zone_name=default,zonegroup_id=d74d963d-58da-4c60-ad13-18a6b0033c09,zonegroup_name=default}
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb 02 09:40:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:24.676+0000 7f6a2fa24140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 ceph-mon[80115]: 2.b scrub starts
Feb 02 09:40:24 compute-1 ceph-mon[80115]: 2.b scrub ok
Feb 02 09:40:24 compute-1 ceph-mon[80115]: 4.15 deep-scrub starts
Feb 02 09:40:24 compute-1 ceph-mon[80115]: 4.15 deep-scrub ok
Feb 02 09:40:24 compute-1 ceph-mon[80115]: 3.b scrub starts
Feb 02 09:40:24 compute-1 ceph-mon[80115]: 3.b scrub ok
Feb 02 09:40:24 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-2.zjyufj' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Feb 02 09:40:24 compute-1 ceph-mon[80115]: from='client.? ' entity='client.rgw.rgw.compute-1.ezjvcf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Feb 02 09:40:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2805705687' entity='client.rgw.rgw.compute-0.vltabo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Feb 02 09:40:24 compute-1 ceph-mon[80115]: osdmap e44: 3 total, 3 up, 3 in
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 02 09:40:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 02 09:40:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:   from numpy import show_config as show_numpy_config
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb 02 09:40:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:24.820+0000 7f6a2fa24140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 radosgw[81528]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Feb 02 09:40:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:24.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb 02 09:40:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:24.884+0000 7f6a2fa24140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:40:24 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb 02 09:40:24 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.a deep-scrub starts
Feb 02 09:40:24 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.a deep-scrub ok
Feb 02 09:40:25 compute-1 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:40:25 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb 02 09:40:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:25.006+0000 7f6a2fa24140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:40:25 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb 02 09:40:25 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb 02 09:40:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:25 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb 02 09:40:25 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb 02 09:40:25 compute-1 ceph-mon[80115]: 2.f scrub starts
Feb 02 09:40:25 compute-1 ceph-mon[80115]: 2.f scrub ok
Feb 02 09:40:25 compute-1 ceph-mon[80115]: 4.0 scrub starts
Feb 02 09:40:25 compute-1 ceph-mon[80115]: 4.0 scrub ok
Feb 02 09:40:25 compute-1 ceph-mon[80115]: 3.19 scrub starts
Feb 02 09:40:25 compute-1 ceph-mon[80115]: 3.19 scrub ok
Feb 02 09:40:25 compute-1 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:40:25 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb 02 09:40:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:25.875+0000 7f6a2fa24140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:40:25 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.d scrub starts
Feb 02 09:40:25 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.d scrub ok
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb 02 09:40:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.078+0000 7f6a2fa24140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb 02 09:40:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.152+0000 7f6a2fa24140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.216+0000 7f6a2fa24140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.286+0000 7f6a2fa24140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.350+0000 7f6a2fa24140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.670+0000 7f6a2fa24140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb 02 09:40:26 compute-1 ceph-mon[80115]: 2.c scrub starts
Feb 02 09:40:26 compute-1 ceph-mon[80115]: 2.c scrub ok
Feb 02 09:40:26 compute-1 ceph-mon[80115]: 4.a deep-scrub starts
Feb 02 09:40:26 compute-1 ceph-mon[80115]: 4.a deep-scrub ok
Feb 02 09:40:26 compute-1 ceph-mon[80115]: 3.16 scrub starts
Feb 02 09:40:26 compute-1 ceph-mon[80115]: 3.16 scrub ok
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb 02 09:40:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:26.758+0000 7f6a2fa24140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:40:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:26.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:26.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:26 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Feb 02 09:40:26 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Feb 02 09:40:26 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.137+0000 7f6a2fa24140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb 02 09:40:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.607+0000 7f6a2fa24140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb 02 09:40:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.669+0000 7f6a2fa24140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb 02 09:40:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.738+0000 7f6a2fa24140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb 02 09:40:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.864+0000 7f6a2fa24140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Feb 02 09:40:27 compute-1 ceph-mon[80115]: 2.e deep-scrub starts
Feb 02 09:40:27 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Feb 02 09:40:27 compute-1 ceph-mon[80115]: 2.e deep-scrub ok
Feb 02 09:40:27 compute-1 ceph-mon[80115]: 4.d scrub starts
Feb 02 09:40:27 compute-1 ceph-mon[80115]: 4.d scrub ok
Feb 02 09:40:27 compute-1 ceph-mon[80115]: 3.0 scrub starts
Feb 02 09:40:27 compute-1 ceph-mon[80115]: 3.0 scrub ok
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:27.925+0000 7f6a2fa24140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:40:27 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb 02 09:40:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:28.056+0000 7f6a2fa24140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb 02 09:40:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:28.250+0000 7f6a2fa24140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:40:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:28.494+0000 7f6a2fa24140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:40:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:28.557+0000 7f6a2fa24140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: mgr load Constructed class from module: dashboard
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x55d9b1b59860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: [dashboard INFO root] Configured CherryPy, starting engine...
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: [dashboard INFO root] Starting engine...
Feb 02 09:40:28 compute-1 ceph-mgr[80422]: [dashboard INFO root] Engine started...
Feb 02 09:40:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:28.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:28.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:28 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.f scrub starts
Feb 02 09:40:28 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.f scrub ok
Feb 02 09:40:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 2.3 scrub starts
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 2.3 scrub ok
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 4.6 scrub starts
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 4.6 scrub ok
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 3.6 scrub starts
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 3.6 scrub ok
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 2.12 scrub starts
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 2.12 scrub ok
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 4.5 scrub starts
Feb 02 09:40:29 compute-1 ceph-mon[80115]: 4.5 scrub ok
Feb 02 09:40:29 compute-1 ceph-mon[80115]: Standby manager daemon compute-1.teascl restarted
Feb 02 09:40:29 compute-1 ceph-mon[80115]: Standby manager daemon compute-1.teascl started
Feb 02 09:40:29 compute-1 ceph-mon[80115]: Standby manager daemon compute-2.gzlyac restarted
Feb 02 09:40:29 compute-1 ceph-mon[80115]: Standby manager daemon compute-2.gzlyac started
Feb 02 09:40:29 compute-1 ceph-mon[80115]: Active manager daemon compute-0.djvyfo restarted
Feb 02 09:40:29 compute-1 ceph-mon[80115]: Activating manager daemon compute-0.djvyfo
Feb 02 09:40:29 compute-1 sshd-session[82193]: Accepted publickey for ceph-admin from 192.168.122.100 port 33354 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:40:29 compute-1 systemd-logind[805]: New session 33 of user ceph-admin.
Feb 02 09:40:29 compute-1 systemd[1]: Started Session 33 of User ceph-admin.
Feb 02 09:40:29 compute-1 sshd-session[82193]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:40:29 compute-1 sudo[82197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:29 compute-1 sudo[82197]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:29 compute-1 sudo[82197]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:29 compute-1 sudo[82222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Feb 02 09:40:29 compute-1 sudo[82222]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:29 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Feb 02 09:40:29 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Feb 02 09:40:30 compute-1 ceph-mon[80115]: 3.7 scrub starts
Feb 02 09:40:30 compute-1 ceph-mon[80115]: 3.7 scrub ok
Feb 02 09:40:30 compute-1 ceph-mon[80115]: 2.10 scrub starts
Feb 02 09:40:30 compute-1 ceph-mon[80115]: 2.10 scrub ok
Feb 02 09:40:30 compute-1 ceph-mon[80115]: 4.f scrub starts
Feb 02 09:40:30 compute-1 ceph-mon[80115]: 4.f scrub ok
Feb 02 09:40:30 compute-1 ceph-mon[80115]: osdmap e45: 3 total, 3 up, 3 in
Feb 02 09:40:30 compute-1 ceph-mon[80115]: mgrmap e13: compute-0.djvyfo(active, starting, since 0.077214s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-0.djvyfo", "id": "compute-0.djvyfo"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-1.teascl", "id": "compute-1.teascl"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-2.gzlyac", "id": "compute-2.gzlyac"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mds metadata"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: Manager daemon compute-0.djvyfo is now available
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/mirror_snapshot_schedule"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/trash_purge_schedule"}]: dispatch
Feb 02 09:40:30 compute-1 ceph-mon[80115]: 4.2 scrub starts
Feb 02 09:40:30 compute-1 ceph-mon[80115]: mgrmap e14: compute-0.djvyfo(active, since 1.10632s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:30 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:30 compute-1 podman[82319]: 2026-02-02 09:40:30.124516565 +0000 UTC m=+0.071227215 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 02 09:40:30 compute-1 podman[82319]: 2026-02-02 09:40:30.230742071 +0000 UTC m=+0.177452741 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:40:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:30 compute-1 sudo[82222]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:30 compute-1 sudo[82426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:30 compute-1 sudo[82426]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:30 compute-1 sudo[82426]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:30 compute-1 sudo[82451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:40:30 compute-1 sudo[82451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:30.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:30 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Feb 02 09:40:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:30 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Feb 02 09:40:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:30.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:31 compute-1 ceph-mon[80115]: 3.a scrub starts
Feb 02 09:40:31 compute-1 ceph-mon[80115]: 3.a scrub ok
Feb 02 09:40:31 compute-1 ceph-mon[80115]: 2.11 scrub starts
Feb 02 09:40:31 compute-1 ceph-mon[80115]: 2.11 scrub ok
Feb 02 09:40:31 compute-1 ceph-mon[80115]: 4.2 scrub ok
Feb 02 09:40:31 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Bus STARTING
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:31 compute-1 ceph-mon[80115]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Feb 02 09:40:31 compute-1 ceph-mon[80115]: Cluster is now healthy
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:31 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:40:31 compute-1 sudo[82451]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:31 compute-1 sudo[82508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:31 compute-1 sudo[82508]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:31 compute-1 sudo[82508]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:31 compute-1 sudo[82533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Feb 02 09:40:31 compute-1 sudo[82533]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:31 compute-1 sudo[82533]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:31 compute-1 sudo[82577]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 02 09:40:31 compute-1 sudo[82577]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:31 compute-1 sudo[82577]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:31 compute-1 sudo[82602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph
Feb 02 09:40:31 compute-1 sudo[82602]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:31 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Feb 02 09:40:31 compute-1 sudo[82602]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:31 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Feb 02 09:40:31 compute-1 sudo[82627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:31 compute-1 sudo[82627]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:31 compute-1 sudo[82627]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Feb 02 09:40:32 compute-1 sudo[82652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.19( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.17( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.13( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.10( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.12( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.14( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 sudo[82652]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.17( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.b( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.e( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.a( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.8( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.b( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.c( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.6( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.4( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 sudo[82652]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.1( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.7( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.6( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.6( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.4( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.2( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.4( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.9( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.1e( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.1d( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.18( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[5.19( empty local-lis/les=0/0 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[7.1b( empty local-lis/les=0/0 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[2.1e( empty local-lis/les=0/0 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1b( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.118619919s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.339263916s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752305984s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972938538s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1b( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.118590355s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.339263916s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752251625s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972938538s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752285004s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.973014832s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752253532s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.973014832s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.19( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128526688s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349411011s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.752019882s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972885132s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751934052s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972869873s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.19( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128501892s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349411011s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751987457s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972885132s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751916885s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972869873s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128319740s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349395752s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128303528s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349395752s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751770973s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972930908s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751785278s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972946167s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751747131s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972930908s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751760483s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972946167s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128523827s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349822998s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.d( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128508568s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349822998s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751496315s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972885132s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751472473s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972885132s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128314972s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349754333s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128300667s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349754333s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751129150s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972656250s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.7( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127918243s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349494934s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751101494s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972656250s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.7( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127898216s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349494934s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751144409s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972885132s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.751119614s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972885132s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750867844s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972656250s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.128810883s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349357605s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750843048s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972656250s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127510071s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349357605s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750620842s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972595215s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.3( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127817154s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349807739s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750597954s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972595215s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.3( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127795219s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349807739s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127767563s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349815369s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.2( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127752304s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349815369s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750327110s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972518921s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.5( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127785683s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.349975586s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750284195s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972518921s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750309944s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972595215s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.5( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127717972s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.349975586s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750294685s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972595215s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127771378s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350097656s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.e( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127698898s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350097656s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750201225s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972648621s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.750186920s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972648621s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127700806s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350204468s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749849319s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972381592s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127601624s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350158691s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.8( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127652168s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350204468s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749823570s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972381592s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.a( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127588272s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350158691s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749593735s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972244263s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749562263s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972244263s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127418518s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350196838s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127482414s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350265503s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127470016s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350265503s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749266624s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972091675s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.15( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127391815s) [1] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350196838s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749685287s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972518921s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749248505s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972091675s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749658585s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972518921s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749163628s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.972068787s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.749152184s) [1] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.972068787s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.12( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127462387s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350486755s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.748907089s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.971946716s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.748896599s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.971946716s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127389908s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active pruub 95.350494385s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.12( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127447128s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350486755s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=33/35 n=0 ec=33/18 lis/c=33/33 les/c/f=35/35/0 sis=46 pruub=12.127364159s) [2] r=-1 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 95.350494385s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.748805046s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active pruub 91.973007202s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:40:32 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 46 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/16 lis/c=31/31 les/c/f=32/32/0 sis=46 pruub=8.748677254s) [2] r=-1 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 91.973007202s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:40:32 compute-1 sudo[82677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:32 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Serving on https://192.168.122.100:7150
Feb 02 09:40:32 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Client ('192.168.122.100', 53184) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 02 09:40:32 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Serving on http://192.168.122.100:8765
Feb 02 09:40:32 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:30] ENGINE Bus STARTED
Feb 02 09:40:32 compute-1 ceph-mon[80115]: 3.e scrub starts
Feb 02 09:40:32 compute-1 ceph-mon[80115]: 3.e scrub ok
Feb 02 09:40:32 compute-1 ceph-mon[80115]: 2.13 scrub starts
Feb 02 09:40:32 compute-1 ceph-mon[80115]: 4.1c scrub starts
Feb 02 09:40:32 compute-1 ceph-mon[80115]: 2.13 scrub ok
Feb 02 09:40:32 compute-1 ceph-mon[80115]: 4.1c scrub ok
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='client.14445 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-password", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 09:40:32 compute-1 ceph-mon[80115]: pgmap v4: 197 pgs: 197 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:40:32 compute-1 ceph-mon[80115]: mgrmap e15: compute-0.djvyfo(active, since 2s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:40:32 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:40:32 compute-1 ceph-mon[80115]: osdmap e46: 3 total, 3 up, 3 in
Feb 02 09:40:32 compute-1 sudo[82677]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82677]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[82725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:32 compute-1 sudo[82725]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82725]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[82750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:32 compute-1 sudo[82750]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82750]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[82775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 02 09:40:32 compute-1 sudo[82775]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82775]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[82800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:32 compute-1 sudo[82800]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82800]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sshd-session[82507]: Invalid user solv from 80.94.92.184 port 57254
Feb 02 09:40:32 compute-1 sudo[82825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:32 compute-1 sudo[82825]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82825]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[82850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:32 compute-1 sudo[82850]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82850]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sshd-session[82507]: Connection closed by invalid user solv 80.94.92.184 port 57254 [preauth]
Feb 02 09:40:32 compute-1 sudo[82875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:32 compute-1 sudo[82875]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82875]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[82900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:32 compute-1 sudo[82900]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82900]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[82948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:32 compute-1 sudo[82948]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82948]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[82973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:32 compute-1 sudo[82973]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82973]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[82998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:32 compute-1 sudo[82998]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[82998]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:32.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:32 compute-1 sudo[83023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 02 09:40:32 compute-1 sudo[83023]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[83023]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 sudo[83048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph
Feb 02 09:40:32 compute-1 sudo[83048]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[83048]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:32.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:32 compute-1 sudo[83073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:40:32 compute-1 sudo[83073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[83073]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:32 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Feb 02 09:40:32 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Feb 02 09:40:32 compute-1 sudo[83098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:32 compute-1 sudo[83098]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:32 compute-1 sudo[83098]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:40:33 compute-1 sudo[83123]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83123]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.1e( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.1f( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.1f( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.1e( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.19( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.1b( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.1e( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.1d( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.18( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.4( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.5( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.6( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.2( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.1( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.3( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.2( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.9( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.4( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.6( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.6( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.3( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.6( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.7( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.e( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.4( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.1( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.c( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.a( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.9( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.8( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.e( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.f( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.b( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.17( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.b( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.10( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[7.13( empty local-lis/les=46/47 n=0 ec=33/19 lis/c=33/33 les/c/f=34/34/0 sis=46) [0] r=0 lpr=46 pi=[33,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.17( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[2.19( empty local-lis/les=46/47 n=0 ec=29/14 lis/c=29/29 les/c/f=31/31/0 sis=46) [0] r=0 lpr=46 pi=[29,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.12( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.18( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[3.19( empty local-lis/les=46/47 n=0 ec=29/15 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.1e( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 47 pg[5.14( empty local-lis/les=46/47 n=0 ec=31/17 lis/c=31/31 les/c/f=32/32/0 sis=46) [0] r=0 lpr=46 pi=[31,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:40:33 compute-1 ceph-mon[80115]: 3.2 scrub starts
Feb 02 09:40:33 compute-1 ceph-mon[80115]: 3.2 scrub ok
Feb 02 09:40:33 compute-1 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.conf
Feb 02 09:40:33 compute-1 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.conf
Feb 02 09:40:33 compute-1 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb 02 09:40:33 compute-1 ceph-mon[80115]: 2.14 scrub starts
Feb 02 09:40:33 compute-1 ceph-mon[80115]: 2.14 scrub ok
Feb 02 09:40:33 compute-1 ceph-mon[80115]: 4.4 scrub starts
Feb 02 09:40:33 compute-1 ceph-mon[80115]: from='client.14457 -' entity='client.admin' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://192.168.122.100:9093", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 09:40:33 compute-1 ceph-mon[80115]: 4.4 scrub ok
Feb 02 09:40:33 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:33 compute-1 sudo[83171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:40:33 compute-1 sudo[83171]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83171]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:40:33 compute-1 sudo[83196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83196]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 02 09:40:33 compute-1 sudo[83221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83221]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:33 compute-1 sudo[83246]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83246]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:33 compute-1 sudo[83271]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83271]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:40:33 compute-1 sudo[83296]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83296]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:33 compute-1 sudo[83321]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83321]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:40:33 compute-1 sudo[83346]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83346]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:40:33 compute-1 sudo[83394]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83394]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:40:33 compute-1 sudo[83419]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83419]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 sudo[83444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:40:33 compute-1 sudo[83444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:33 compute-1 sudo[83444]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:33 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Feb 02 09:40:33 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.1f scrub ok
Feb 02 09:40:34 compute-1 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:34 compute-1 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:34 compute-1 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:34 compute-1 ceph-mon[80115]: 3.1b scrub starts
Feb 02 09:40:34 compute-1 ceph-mon[80115]: 3.1b scrub ok
Feb 02 09:40:34 compute-1 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:40:34 compute-1 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:40:34 compute-1 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:40:34 compute-1 ceph-mon[80115]: from='client.14463 -' entity='client.admin' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://192.168.122.100:9092", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 09:40:34 compute-1 ceph-mon[80115]: 2.1a scrub starts
Feb 02 09:40:34 compute-1 ceph-mon[80115]: 2.1a scrub ok
Feb 02 09:40:34 compute-1 ceph-mon[80115]: 6.18 scrub starts
Feb 02 09:40:34 compute-1 ceph-mon[80115]: 6.18 scrub ok
Feb 02 09:40:34 compute-1 ceph-mon[80115]: pgmap v6: 197 pgs: 197 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:34 compute-1 ceph-mon[80115]: osdmap e47: 3 total, 3 up, 3 in
Feb 02 09:40:34 compute-1 ceph-mon[80115]: mgrmap e16: compute-0.djvyfo(active, since 4s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:34 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:34 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:34 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:34 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:34 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:34 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:34 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:34 compute-1 ceph-mon[80115]: from='mgr.24202 192.168.122.100:0/2430780993' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:34.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:34.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:34 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.c scrub starts
Feb 02 09:40:34 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.c scrub ok
Feb 02 09:40:35 compute-1 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:40:35 compute-1 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:40:35 compute-1 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:40:35 compute-1 ceph-mon[80115]: 3.1a scrub starts
Feb 02 09:40:35 compute-1 ceph-mon[80115]: 3.1a scrub ok
Feb 02 09:40:35 compute-1 ceph-mon[80115]: 7.1c deep-scrub starts
Feb 02 09:40:35 compute-1 ceph-mon[80115]: 7.1c deep-scrub ok
Feb 02 09:40:35 compute-1 ceph-mon[80115]: 6.1f scrub starts
Feb 02 09:40:35 compute-1 ceph-mon[80115]: 6.1f scrub ok
Feb 02 09:40:35 compute-1 ceph-mon[80115]: Deploying daemon node-exporter.compute-0 on compute-0
Feb 02 09:40:35 compute-1 ceph-mon[80115]: from='client.14469 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "http://192.168.122.100:3100", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 09:40:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:35 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Feb 02 09:40:35 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Feb 02 09:40:36 compute-1 ceph-mon[80115]: 3.9 scrub starts
Feb 02 09:40:36 compute-1 ceph-mon[80115]: 3.9 scrub ok
Feb 02 09:40:36 compute-1 ceph-mon[80115]: 7.12 deep-scrub starts
Feb 02 09:40:36 compute-1 ceph-mon[80115]: 7.12 deep-scrub ok
Feb 02 09:40:36 compute-1 ceph-mon[80115]: 6.c scrub starts
Feb 02 09:40:36 compute-1 ceph-mon[80115]: 6.c scrub ok
Feb 02 09:40:36 compute-1 ceph-mon[80115]: pgmap v8: 197 pgs: 197 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 35 KiB/s rd, 0 B/s wr, 14 op/s
Feb 02 09:40:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3702593450' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Feb 02 09:40:36 compute-1 ceph-mon[80115]: 3.8 scrub starts
Feb 02 09:40:36 compute-1 ceph-mgr[80422]: mgr handle_mgr_map respawning because set of enabled modules changed!
Feb 02 09:40:36 compute-1 sshd-session[82196]: Connection closed by 192.168.122.100 port 33354
Feb 02 09:40:36 compute-1 sshd-session[82193]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:40:36 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Feb 02 09:40:36 compute-1 systemd[1]: session-33.scope: Consumed 4.051s CPU time.
Feb 02 09:40:36 compute-1 systemd-logind[805]: Session 33 logged out. Waiting for processes to exit.
Feb 02 09:40:36 compute-1 systemd-logind[805]: Removed session 33.
Feb 02 09:40:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setuser ceph since I am not root
Feb 02 09:40:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setgroup ceph since I am not root
Feb 02 09:40:36 compute-1 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb 02 09:40:36 compute-1 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb 02 09:40:36 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb 02 09:40:36 compute-1 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:40:36 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb 02 09:40:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:36.449+0000 7f820d21d140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:40:36 compute-1 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:40:36 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb 02 09:40:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:36.531+0000 7f820d21d140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:40:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:36.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:36.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:36 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Feb 02 09:40:36 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Feb 02 09:40:37 compute-1 ceph-mon[80115]: 3.8 scrub ok
Feb 02 09:40:37 compute-1 ceph-mon[80115]: 6.6 scrub starts
Feb 02 09:40:37 compute-1 ceph-mon[80115]: 2.16 scrub starts
Feb 02 09:40:37 compute-1 ceph-mon[80115]: 6.6 scrub ok
Feb 02 09:40:37 compute-1 ceph-mon[80115]: 2.16 scrub ok
Feb 02 09:40:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3702593450' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Feb 02 09:40:37 compute-1 ceph-mon[80115]: mgrmap e17: compute-0.djvyfo(active, since 7s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:37 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb 02 09:40:37 compute-1 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:40:37 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb 02 09:40:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:37.347+0000 7f820d21d140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:40:37 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb 02 09:40:37 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.0 deep-scrub starts
Feb 02 09:40:37 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.0 deep-scrub ok
Feb 02 09:40:37 compute-1 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:40:37 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb 02 09:40:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:37.979+0000 7f820d21d140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:40:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 02 09:40:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 02 09:40:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:   from numpy import show_config as show_numpy_config
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb 02 09:40:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:38.138+0000 7f820d21d140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb 02 09:40:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:38.211+0000 7f820d21d140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:40:38 compute-1 ceph-mon[80115]: 5.e scrub starts
Feb 02 09:40:38 compute-1 ceph-mon[80115]: 5.e scrub ok
Feb 02 09:40:38 compute-1 ceph-mon[80115]: 6.4 scrub starts
Feb 02 09:40:38 compute-1 ceph-mon[80115]: 7.17 scrub starts
Feb 02 09:40:38 compute-1 ceph-mon[80115]: 6.4 scrub ok
Feb 02 09:40:38 compute-1 ceph-mon[80115]: 7.17 scrub ok
Feb 02 09:40:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2174886532' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Feb 02 09:40:38 compute-1 ceph-mon[80115]: 3.1d scrub starts
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb 02 09:40:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:38.370+0000 7f820d21d140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb 02 09:40:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:38.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:38.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:38 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.f scrub starts
Feb 02 09:40:38 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb 02 09:40:38 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.f scrub ok
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb 02 09:40:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.241+0000 7f820d21d140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mon[80115]: 3.1d scrub ok
Feb 02 09:40:39 compute-1 ceph-mon[80115]: 6.0 deep-scrub starts
Feb 02 09:40:39 compute-1 ceph-mon[80115]: 6.0 deep-scrub ok
Feb 02 09:40:39 compute-1 ceph-mon[80115]: 7.15 scrub starts
Feb 02 09:40:39 compute-1 ceph-mon[80115]: 7.15 scrub ok
Feb 02 09:40:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2174886532' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Feb 02 09:40:39 compute-1 ceph-mon[80115]: mgrmap e18: compute-0.djvyfo(active, since 9s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:39 compute-1 ceph-mon[80115]: 5.4 deep-scrub starts
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb 02 09:40:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.435+0000 7f820d21d140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb 02 09:40:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.502+0000 7f820d21d140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb 02 09:40:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.560+0000 7f820d21d140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb 02 09:40:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.629+0000 7f820d21d140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb 02 09:40:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.693+0000 7f820d21d140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Feb 02 09:40:39 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:40:39 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb 02 09:40:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:39.991+0000 7f820d21d140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:40:40 compute-1 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:40:40 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb 02 09:40:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:40.077+0000 7f820d21d140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:40:40 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb 02 09:40:40 compute-1 ceph-mon[80115]: 5.4 deep-scrub ok
Feb 02 09:40:40 compute-1 ceph-mon[80115]: 2.17 scrub starts
Feb 02 09:40:40 compute-1 ceph-mon[80115]: 2.17 scrub ok
Feb 02 09:40:40 compute-1 ceph-mon[80115]: 6.f scrub starts
Feb 02 09:40:40 compute-1 ceph-mon[80115]: 6.f scrub ok
Feb 02 09:40:40 compute-1 ceph-mon[80115]: 5.0 scrub starts
Feb 02 09:40:40 compute-1 ceph-mon[80115]: 5.0 scrub ok
Feb 02 09:40:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:40 compute-1 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:40:40 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb 02 09:40:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:40.455+0000 7f820d21d140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:40:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:40.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:40.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:40 compute-1 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:40:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:40.931+0000 7f820d21d140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:40:40 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb 02 09:40:40 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.b scrub starts
Feb 02 09:40:40 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.b scrub ok
Feb 02 09:40:40 compute-1 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:40:40 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb 02 09:40:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:40.997+0000 7f820d21d140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb 02 09:40:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.068+0000 7f820d21d140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb 02 09:40:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.202+0000 7f820d21d140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb 02 09:40:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.265+0000 7f820d21d140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mon[80115]: 7.0 scrub starts
Feb 02 09:40:41 compute-1 ceph-mon[80115]: 6.9 scrub starts
Feb 02 09:40:41 compute-1 ceph-mon[80115]: 7.0 scrub ok
Feb 02 09:40:41 compute-1 ceph-mon[80115]: 6.9 scrub ok
Feb 02 09:40:41 compute-1 ceph-mon[80115]: 5.d scrub starts
Feb 02 09:40:41 compute-1 ceph-mon[80115]: 5.d scrub ok
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb 02 09:40:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.414+0000 7f820d21d140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb 02 09:40:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.615+0000 7f820d21d140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb 02 09:40:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.865+0000 7f820d21d140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Feb 02 09:40:41 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:41.929+0000 7f820d21d140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x56375e79f860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr handle_mgr_map respawning because set of enabled modules changed!
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  e: '/usr/bin/ceph-mgr'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  0: '/usr/bin/ceph-mgr'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  1: '-n'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  2: 'mgr.compute-1.teascl'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  3: '-f'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  4: '--setuser'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  5: 'ceph'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  6: '--setgroup'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  7: 'ceph'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  8: '--default-log-to-file=false'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  9: '--default-log-to-journald=true'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  10: '--default-log-to-stderr=false'
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: mgr respawn  exe_path /proc/self/exe
Feb 02 09:40:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setuser ceph since I am not root
Feb 02 09:40:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setgroup ceph since I am not root
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb 02 09:40:41 compute-1 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb 02 09:40:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Feb 02 09:40:42 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb 02 09:40:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:42.127+0000 7f947fe91140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:40:42 compute-1 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:40:42 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb 02 09:40:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:42.204+0000 7f947fe91140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:40:42 compute-1 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:40:42 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb 02 09:40:42 compute-1 ceph-mon[80115]: 6.b scrub starts
Feb 02 09:40:42 compute-1 ceph-mon[80115]: 6.b scrub ok
Feb 02 09:40:42 compute-1 ceph-mon[80115]: 2.2 scrub starts
Feb 02 09:40:42 compute-1 ceph-mon[80115]: 2.2 scrub ok
Feb 02 09:40:42 compute-1 ceph-mon[80115]: 5.b deep-scrub starts
Feb 02 09:40:42 compute-1 ceph-mon[80115]: 5.b deep-scrub ok
Feb 02 09:40:42 compute-1 ceph-mon[80115]: Standby manager daemon compute-2.gzlyac restarted
Feb 02 09:40:42 compute-1 ceph-mon[80115]: Standby manager daemon compute-2.gzlyac started
Feb 02 09:40:42 compute-1 ceph-mon[80115]: Standby manager daemon compute-1.teascl restarted
Feb 02 09:40:42 compute-1 ceph-mon[80115]: Standby manager daemon compute-1.teascl started
Feb 02 09:40:42 compute-1 ceph-mon[80115]: Active manager daemon compute-0.djvyfo restarted
Feb 02 09:40:42 compute-1 ceph-mon[80115]: Activating manager daemon compute-0.djvyfo
Feb 02 09:40:42 compute-1 ceph-mon[80115]: osdmap e48: 3 total, 3 up, 3 in
Feb 02 09:40:42 compute-1 ceph-mon[80115]: mgrmap e19: compute-0.djvyfo(active, starting, since 0.0330787s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:42.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:42 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb 02 09:40:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:42.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:42 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Feb 02 09:40:42 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Feb 02 09:40:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:42.969+0000 7f947fe91140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:40:42 compute-1 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:40:42 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb 02 09:40:43 compute-1 ceph-mon[80115]: 6.14 scrub starts
Feb 02 09:40:43 compute-1 ceph-mon[80115]: 6.14 scrub ok
Feb 02 09:40:43 compute-1 ceph-mon[80115]: 7.7 scrub starts
Feb 02 09:40:43 compute-1 ceph-mon[80115]: 7.7 scrub ok
Feb 02 09:40:43 compute-1 ceph-mon[80115]: 5.8 scrub starts
Feb 02 09:40:43 compute-1 ceph-mon[80115]: 5.8 scrub ok
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb 02 09:40:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:43.527+0000 7f947fe91140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb 02 09:40:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 02 09:40:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 02 09:40:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:   from numpy import show_config as show_numpy_config
Feb 02 09:40:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:43.665+0000 7f947fe91140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb 02 09:40:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:43.727+0000 7f947fe91140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb 02 09:40:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:43.849+0000 7f947fe91140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:40:43 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb 02 09:40:43 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.11 deep-scrub starts
Feb 02 09:40:43 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.11 deep-scrub ok
Feb 02 09:40:44 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb 02 09:40:44 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb 02 09:40:44 compute-1 ceph-mon[80115]: 6.16 scrub starts
Feb 02 09:40:44 compute-1 ceph-mon[80115]: 6.16 scrub ok
Feb 02 09:40:44 compute-1 ceph-mon[80115]: 7.1 deep-scrub starts
Feb 02 09:40:44 compute-1 ceph-mon[80115]: 7.1 deep-scrub ok
Feb 02 09:40:44 compute-1 ceph-mon[80115]: 5.13 scrub starts
Feb 02 09:40:44 compute-1 ceph-mon[80115]: 5.13 scrub ok
Feb 02 09:40:44 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb 02 09:40:44 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb 02 09:40:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:44.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:44 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Feb 02 09:40:44 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Feb 02 09:40:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:44.893+0000 7f947fe91140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:40:44 compute-1 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:40:44 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb 02 09:40:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:44.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.122+0000 7f947fe91140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb 02 09:40:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.200+0000 7f947fe91140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb 02 09:40:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.269+0000 7f947fe91140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb 02 09:40:45 compute-1 ceph-mon[80115]: 6.11 deep-scrub starts
Feb 02 09:40:45 compute-1 ceph-mon[80115]: 6.11 deep-scrub ok
Feb 02 09:40:45 compute-1 ceph-mon[80115]: 7.d scrub starts
Feb 02 09:40:45 compute-1 ceph-mon[80115]: 7.d scrub ok
Feb 02 09:40:45 compute-1 ceph-mon[80115]: 5.12 deep-scrub starts
Feb 02 09:40:45 compute-1 ceph-mon[80115]: 5.12 deep-scrub ok
Feb 02 09:40:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.354+0000 7f947fe91140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb 02 09:40:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.427+0000 7f947fe91140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb 02 09:40:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.757+0000 7f947fe91140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb 02 09:40:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:45.854+0000 7f947fe91140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:40:45 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb 02 09:40:45 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Feb 02 09:40:45 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Feb 02 09:40:46 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb 02 09:40:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:46.295+0000 7f947fe91140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:40:46 compute-1 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:40:46 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb 02 09:40:46 compute-1 systemd[1]: Stopping User Manager for UID 42477...
Feb 02 09:40:46 compute-1 systemd[72655]: Activating special unit Exit the Session...
Feb 02 09:40:46 compute-1 systemd[72655]: Stopped target Main User Target.
Feb 02 09:40:46 compute-1 systemd[72655]: Stopped target Basic System.
Feb 02 09:40:46 compute-1 systemd[72655]: Stopped target Paths.
Feb 02 09:40:46 compute-1 systemd[72655]: Stopped target Sockets.
Feb 02 09:40:46 compute-1 systemd[72655]: Stopped target Timers.
Feb 02 09:40:46 compute-1 systemd[72655]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 02 09:40:46 compute-1 systemd[72655]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 02 09:40:46 compute-1 systemd[72655]: Closed D-Bus User Message Bus Socket.
Feb 02 09:40:46 compute-1 systemd[72655]: Stopped Create User's Volatile Files and Directories.
Feb 02 09:40:46 compute-1 systemd[72655]: Removed slice User Application Slice.
Feb 02 09:40:46 compute-1 systemd[72655]: Reached target Shutdown.
Feb 02 09:40:46 compute-1 systemd[72655]: Finished Exit the Session.
Feb 02 09:40:46 compute-1 systemd[72655]: Reached target Exit the Session.
Feb 02 09:40:46 compute-1 systemd[1]: user@42477.service: Deactivated successfully.
Feb 02 09:40:46 compute-1 systemd[1]: Stopped User Manager for UID 42477.
Feb 02 09:40:46 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Feb 02 09:40:46 compute-1 systemd[1]: run-user-42477.mount: Deactivated successfully.
Feb 02 09:40:46 compute-1 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Feb 02 09:40:46 compute-1 ceph-mon[80115]: 6.10 scrub starts
Feb 02 09:40:46 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Feb 02 09:40:46 compute-1 ceph-mon[80115]: 6.10 scrub ok
Feb 02 09:40:46 compute-1 ceph-mon[80115]: 7.c scrub starts
Feb 02 09:40:46 compute-1 ceph-mon[80115]: 7.c scrub ok
Feb 02 09:40:46 compute-1 ceph-mon[80115]: 5.1a scrub starts
Feb 02 09:40:46 compute-1 ceph-mon[80115]: 5.1a scrub ok
Feb 02 09:40:46 compute-1 systemd[1]: Removed slice User Slice of UID 42477.
Feb 02 09:40:46 compute-1 systemd[1]: user-42477.slice: Consumed 1min 531ms CPU time.
Feb 02 09:40:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:46.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:46.878+0000 7f947fe91140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:40:46 compute-1 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:40:46 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb 02 09:40:46 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Feb 02 09:40:46 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Feb 02 09:40:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:46.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:46.964+0000 7f947fe91140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:40:46 compute-1 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:40:46 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb 02 09:40:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.045+0000 7f947fe91140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb 02 09:40:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.187+0000 7f947fe91140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb 02 09:40:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.254+0000 7f947fe91140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb 02 09:40:47 compute-1 ceph-mon[80115]: 6.13 scrub starts
Feb 02 09:40:47 compute-1 ceph-mon[80115]: 6.13 scrub ok
Feb 02 09:40:47 compute-1 ceph-mon[80115]: 7.19 scrub starts
Feb 02 09:40:47 compute-1 ceph-mon[80115]: 7.19 scrub ok
Feb 02 09:40:47 compute-1 ceph-mon[80115]: 6.1b scrub starts
Feb 02 09:40:47 compute-1 ceph-mon[80115]: 6.1b scrub ok
Feb 02 09:40:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.405+0000 7f947fe91140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb 02 09:40:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Feb 02 09:40:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.641+0000 7f947fe91140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb 02 09:40:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.917+0000 7f947fe91140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb 02 09:40:47 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Feb 02 09:40:47 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Feb 02 09:40:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:40:47.991+0000 7f947fe91140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x55ee1e8816c0 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: mgr load Constructed class from module: dashboard
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: [dashboard INFO root] Configured CherryPy, starting engine...
Feb 02 09:40:47 compute-1 ceph-mgr[80422]: [dashboard INFO root] Starting engine...
Feb 02 09:40:48 compute-1 ceph-mgr[80422]: [dashboard INFO root] Engine started...
Feb 02 09:40:48 compute-1 sshd-session[83533]: Accepted publickey for ceph-admin from 192.168.122.100 port 48194 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:40:48 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Feb 02 09:40:48 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Feb 02 09:40:48 compute-1 systemd-logind[805]: New session 34 of user ceph-admin.
Feb 02 09:40:48 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Feb 02 09:40:48 compute-1 systemd[1]: Starting User Manager for UID 42477...
Feb 02 09:40:48 compute-1 systemd[83549]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:40:48 compute-1 ceph-mon[80115]: 6.1d scrub starts
Feb 02 09:40:48 compute-1 ceph-mon[80115]: 6.1d scrub ok
Feb 02 09:40:48 compute-1 ceph-mon[80115]: 7.1a scrub starts
Feb 02 09:40:48 compute-1 ceph-mon[80115]: 7.1a scrub ok
Feb 02 09:40:48 compute-1 ceph-mon[80115]: 4.19 scrub starts
Feb 02 09:40:48 compute-1 ceph-mon[80115]: 4.19 scrub ok
Feb 02 09:40:48 compute-1 ceph-mon[80115]: Standby manager daemon compute-2.gzlyac restarted
Feb 02 09:40:48 compute-1 ceph-mon[80115]: Standby manager daemon compute-2.gzlyac started
Feb 02 09:40:48 compute-1 ceph-mon[80115]: Active manager daemon compute-0.djvyfo restarted
Feb 02 09:40:48 compute-1 ceph-mon[80115]: Activating manager daemon compute-0.djvyfo
Feb 02 09:40:48 compute-1 ceph-mon[80115]: osdmap e49: 3 total, 3 up, 3 in
Feb 02 09:40:48 compute-1 ceph-mon[80115]: mgrmap e20: compute-0.djvyfo(active, starting, since 0.0410296s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-0.djvyfo", "id": "compute-0.djvyfo"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-1.teascl", "id": "compute-1.teascl"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-2.gzlyac", "id": "compute-2.gzlyac"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mds metadata"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: Manager daemon compute-0.djvyfo is now available
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/mirror_snapshot_schedule"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/trash_purge_schedule"}]: dispatch
Feb 02 09:40:48 compute-1 ceph-mon[80115]: Standby manager daemon compute-1.teascl restarted
Feb 02 09:40:48 compute-1 ceph-mon[80115]: Standby manager daemon compute-1.teascl started
Feb 02 09:40:48 compute-1 systemd[83549]: Queued start job for default target Main User Target.
Feb 02 09:40:48 compute-1 systemd[83549]: Created slice User Application Slice.
Feb 02 09:40:48 compute-1 systemd[83549]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 02 09:40:48 compute-1 systemd[83549]: Started Daily Cleanup of User's Temporary Directories.
Feb 02 09:40:48 compute-1 systemd[83549]: Reached target Paths.
Feb 02 09:40:48 compute-1 systemd[83549]: Reached target Timers.
Feb 02 09:40:48 compute-1 systemd[83549]: Starting D-Bus User Message Bus Socket...
Feb 02 09:40:48 compute-1 systemd[83549]: Starting Create User's Volatile Files and Directories...
Feb 02 09:40:48 compute-1 systemd[83549]: Listening on D-Bus User Message Bus Socket.
Feb 02 09:40:48 compute-1 systemd[83549]: Reached target Sockets.
Feb 02 09:40:48 compute-1 systemd[83549]: Finished Create User's Volatile Files and Directories.
Feb 02 09:40:48 compute-1 systemd[83549]: Reached target Basic System.
Feb 02 09:40:48 compute-1 systemd[83549]: Reached target Main User Target.
Feb 02 09:40:48 compute-1 systemd[83549]: Startup finished in 119ms.
Feb 02 09:40:48 compute-1 systemd[1]: Started User Manager for UID 42477.
Feb 02 09:40:48 compute-1 systemd[1]: Started Session 34 of User ceph-admin.
Feb 02 09:40:48 compute-1 sshd-session[83533]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:40:48 compute-1 sudo[83565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:48 compute-1 sudo[83565]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:48 compute-1 sudo[83565]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:48 compute-1 sudo[83590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Feb 02 09:40:48 compute-1 sudo[83590]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:48 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e2 new map
Feb 02 09:40:48 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e2 print_map
                                           e2
                                           btime 2026-02-02T09:40:48.656641+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:40:48.656583+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
Feb 02 09:40:48 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Feb 02 09:40:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:48.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:48.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:48 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Feb 02 09:40:48 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Feb 02 09:40:49 compute-1 podman[83687]: 2026-02-02 09:40:49.193362615 +0000 UTC m=+0.073392480 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Feb 02 09:40:49 compute-1 podman[83687]: 2026-02-02 09:40:49.287017481 +0000 UTC m=+0.167047326 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:40:49 compute-1 ceph-mon[80115]: 5.19 scrub starts
Feb 02 09:40:49 compute-1 ceph-mon[80115]: 5.19 scrub ok
Feb 02 09:40:49 compute-1 ceph-mon[80115]: 6.19 deep-scrub starts
Feb 02 09:40:49 compute-1 ceph-mon[80115]: 6.19 deep-scrub ok
Feb 02 09:40:49 compute-1 ceph-mon[80115]: 6.12 deep-scrub starts
Feb 02 09:40:49 compute-1 ceph-mon[80115]: 6.12 deep-scrub ok
Feb 02 09:40:49 compute-1 ceph-mon[80115]: mgrmap e21: compute-0.djvyfo(active, since 1.07277s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Feb 02 09:40:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Feb 02 09:40:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Feb 02 09:40:49 compute-1 ceph-mon[80115]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Feb 02 09:40:49 compute-1 ceph-mon[80115]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Feb 02 09:40:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Feb 02 09:40:49 compute-1 ceph-mon[80115]: osdmap e50: 3 total, 3 up, 3 in
Feb 02 09:40:49 compute-1 ceph-mon[80115]: fsmap cephfs:0
Feb 02 09:40:49 compute-1 ceph-mon[80115]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Feb 02 09:40:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:49 compute-1 sudo[83590]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:49 compute-1 sudo[83793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:49 compute-1 sudo[83793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:49 compute-1 sudo[83793]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:49 compute-1 sudo[83818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:40:49 compute-1 sudo[83818]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:49 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Feb 02 09:40:49 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Feb 02 09:40:50 compute-1 sudo[83818]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:50 compute-1 sudo[83874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:50 compute-1 sudo[83874]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:50 compute-1 sudo[83874]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:50 compute-1 sudo[83899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Feb 02 09:40:50 compute-1 sudo[83899]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:50 compute-1 ceph-mon[80115]: 7.1b scrub starts
Feb 02 09:40:50 compute-1 ceph-mon[80115]: 7.1b scrub ok
Feb 02 09:40:50 compute-1 ceph-mon[80115]: 4.1b scrub starts
Feb 02 09:40:50 compute-1 ceph-mon[80115]: 4.1b scrub ok
Feb 02 09:40:50 compute-1 ceph-mon[80115]: 4.3 scrub starts
Feb 02 09:40:50 compute-1 ceph-mon[80115]: 4.3 scrub ok
Feb 02 09:40:50 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Bus STARTING
Feb 02 09:40:50 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Serving on https://192.168.122.100:7150
Feb 02 09:40:50 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Client ('192.168.122.100', 55968) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 02 09:40:50 compute-1 ceph-mon[80115]: from='client.14538 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 09:40:50 compute-1 ceph-mon[80115]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Feb 02 09:40:50 compute-1 ceph-mon[80115]: pgmap v5: 197 pgs: 197 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:50 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Serving on http://192.168.122.100:8765
Feb 02 09:40:50 compute-1 ceph-mon[80115]: [02/Feb/2026:09:40:49] ENGINE Bus STARTED
Feb 02 09:40:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:50 compute-1 sudo[83899]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:50.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:50.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:50 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Feb 02 09:40:50 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Feb 02 09:40:51 compute-1 ceph-mon[80115]: 5.1d scrub starts
Feb 02 09:40:51 compute-1 ceph-mon[80115]: 5.1d scrub ok
Feb 02 09:40:51 compute-1 ceph-mon[80115]: 4.18 scrub starts
Feb 02 09:40:51 compute-1 ceph-mon[80115]: 4.18 scrub ok
Feb 02 09:40:51 compute-1 ceph-mon[80115]: 4.1d scrub starts
Feb 02 09:40:51 compute-1 ceph-mon[80115]: 4.1d scrub ok
Feb 02 09:40:51 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:51 compute-1 ceph-mon[80115]: mgrmap e22: compute-0.djvyfo(active, since 2s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:51 compute-1 ceph-mon[80115]: from='client.14547 -' entity='client.admin' cmd=[{"prefix": "nfs cluster create", "cluster_id": "cephfs", "ingress": true, "virtual_ip": "192.168.122.2/24", "ingress_mode": "haproxy-protocol", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 09:40:51 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Feb 02 09:40:51 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:51 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:51 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb 02 09:40:51 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:51 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:51 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb 02 09:40:51 compute-1 sudo[83942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 02 09:40:51 compute-1 sudo[83942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:51 compute-1 sudo[83942]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:51 compute-1 sudo[83967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph
Feb 02 09:40:51 compute-1 sudo[83967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:51 compute-1 sudo[83967]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Feb 02 09:40:51 compute-1 sudo[83992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:51 compute-1 sudo[83992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:51 compute-1 sudo[83992]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:51 compute-1 sudo[84017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:51 compute-1 sudo[84017]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:51 compute-1 sudo[84017]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:51 compute-1 sudo[84042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:51 compute-1 sudo[84042]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:51 compute-1 sudo[84042]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:51 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Feb 02 09:40:51 compute-1 sudo[84090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:51 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Feb 02 09:40:51 compute-1 sudo[84090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:51 compute-1 sudo[84090]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:51 compute-1 sudo[84115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:40:51 compute-1 sudo[84115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:51 compute-1 sudo[84115]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 02 09:40:52 compute-1 sudo[84140]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84140]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:52 compute-1 sudo[84165]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84165]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:52 compute-1 sudo[84190]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84190]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:52 compute-1 sudo[84215]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84215]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:52 compute-1 sudo[84240]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84240]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:52 compute-1 sudo[84265]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84265]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:52 compute-1 sudo[84313]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84313]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 ceph-mon[80115]: 5.5 scrub starts
Feb 02 09:40:52 compute-1 ceph-mon[80115]: 5.5 scrub ok
Feb 02 09:40:52 compute-1 ceph-mon[80115]: 5.18 deep-scrub starts
Feb 02 09:40:52 compute-1 ceph-mon[80115]: 5.18 deep-scrub ok
Feb 02 09:40:52 compute-1 ceph-mon[80115]: 2.18 scrub starts
Feb 02 09:40:52 compute-1 ceph-mon[80115]: 2.18 scrub ok
Feb 02 09:40:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb 02 09:40:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:40:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:40:52 compute-1 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.conf
Feb 02 09:40:52 compute-1 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.conf
Feb 02 09:40:52 compute-1 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb 02 09:40:52 compute-1 ceph-mon[80115]: pgmap v6: 197 pgs: 197 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Feb 02 09:40:52 compute-1 ceph-mon[80115]: osdmap e51: 3 total, 3 up, 3 in
Feb 02 09:40:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Feb 02 09:40:52 compute-1 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:52 compute-1 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:52 compute-1 sudo[84338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:40:52 compute-1 sudo[84338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84338]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:52 compute-1 sudo[84363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84363]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 02 09:40:52 compute-1 sudo[84388]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84388]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph
Feb 02 09:40:52 compute-1 sudo[84413]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Feb 02 09:40:52 compute-1 sudo[84413]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:40:52 compute-1 sudo[84438]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84438]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 sudo[84463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:52 compute-1 sudo[84463]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84463]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:52.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:52 compute-1 sudo[84488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:40:52 compute-1 sudo[84488]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84488]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:52.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:52 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Feb 02 09:40:52 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Feb 02 09:40:52 compute-1 sudo[84536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:40:52 compute-1 sudo[84536]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:52 compute-1 sudo[84536]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:40:53 compute-1 sudo[84561]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84561]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 02 09:40:53 compute-1 sudo[84586]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84586]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:53 compute-1 sudo[84611]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84611]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:40:53 compute-1 sudo[84636]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84636]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:40:53 compute-1 sudo[84661]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84661]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:53 compute-1 sudo[84686]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84686]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:40:53 compute-1 sudo[84711]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84711]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:40:53 compute-1 sudo[84759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84759]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 ceph-mon[80115]: 3.4 scrub starts
Feb 02 09:40:53 compute-1 ceph-mon[80115]: 3.4 scrub ok
Feb 02 09:40:53 compute-1 ceph-mon[80115]: 4.1a scrub starts
Feb 02 09:40:53 compute-1 ceph-mon[80115]: 4.1a scrub ok
Feb 02 09:40:53 compute-1 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:40:53 compute-1 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:40:53 compute-1 ceph-mon[80115]: 2.15 scrub starts
Feb 02 09:40:53 compute-1 ceph-mon[80115]: 2.15 scrub ok
Feb 02 09:40:53 compute-1 ceph-mon[80115]: mgrmap e23: compute-0.djvyfo(active, since 4s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:53 compute-1 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:40:53 compute-1 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:40:53 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Feb 02 09:40:53 compute-1 ceph-mon[80115]: osdmap e52: 3 total, 3 up, 3 in
Feb 02 09:40:53 compute-1 ceph-mon[80115]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Feb 02 09:40:53 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:53 compute-1 ceph-mon[80115]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Feb 02 09:40:53 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:53 compute-1 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:40:53 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:53 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:53 compute-1 sudo[84784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:40:53 compute-1 sudo[84784]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84784]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:40:53 compute-1 sudo[84809]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84809]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Feb 02 09:40:53 compute-1 sudo[84834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:40:53 compute-1 sudo[84834]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 sudo[84834]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:53 compute-1 sudo[84859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/prometheus/node-exporter:v1.7.0 --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:40:53 compute-1 sudo[84859]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:40:53 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Feb 02 09:40:53 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Feb 02 09:40:54 compute-1 systemd[1]: Reloading.
Feb 02 09:40:54 compute-1 systemd-sysv-generator[84952]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:40:54 compute-1 systemd-rc-local-generator[84943]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:40:54 compute-1 systemd[1]: Reloading.
Feb 02 09:40:54 compute-1 ceph-mon[80115]: 5.3 scrub starts
Feb 02 09:40:54 compute-1 ceph-mon[80115]: 5.3 scrub ok
Feb 02 09:40:54 compute-1 ceph-mon[80115]: 5.1b scrub starts
Feb 02 09:40:54 compute-1 ceph-mon[80115]: 5.1b scrub ok
Feb 02 09:40:54 compute-1 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:40:54 compute-1 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:40:54 compute-1 ceph-mon[80115]: 7.1f scrub starts
Feb 02 09:40:54 compute-1 ceph-mon[80115]: 7.1f scrub ok
Feb 02 09:40:54 compute-1 ceph-mon[80115]: pgmap v9: 198 pgs: 1 unknown, 197 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:40:54 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:54 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:54 compute-1 ceph-mon[80115]: osdmap e53: 3 total, 3 up, 3 in
Feb 02 09:40:54 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:54 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:54 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:54 compute-1 ceph-mon[80115]: Deploying daemon node-exporter.compute-1 on compute-1
Feb 02 09:40:54 compute-1 ceph-mon[80115]: 6.1e scrub starts
Feb 02 09:40:54 compute-1 ceph-mon[80115]: 6.1e scrub ok
Feb 02 09:40:54 compute-1 systemd-rc-local-generator[84995]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:40:54 compute-1 systemd-sysv-generator[84998]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:40:54 compute-1 systemd[1]: Starting Ceph node-exporter.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:40:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:54.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:54.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:54 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.6 deep-scrub starts
Feb 02 09:40:54 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.6 deep-scrub ok
Feb 02 09:40:54 compute-1 bash[85049]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Feb 02 09:40:55 compute-1 bash[85049]: Getting image source signatures
Feb 02 09:40:55 compute-1 bash[85049]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Feb 02 09:40:55 compute-1 bash[85049]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Feb 02 09:40:55 compute-1 bash[85049]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Feb 02 09:40:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:40:55 compute-1 ceph-mon[80115]: 3.1 scrub starts
Feb 02 09:40:55 compute-1 ceph-mon[80115]: 3.1 scrub ok
Feb 02 09:40:55 compute-1 ceph-mon[80115]: 4.c scrub starts
Feb 02 09:40:55 compute-1 ceph-mon[80115]: 4.c scrub ok
Feb 02 09:40:55 compute-1 ceph-mon[80115]: mgrmap e24: compute-0.djvyfo(active, since 7s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:40:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1616834281' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Feb 02 09:40:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1616834281' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Feb 02 09:40:55 compute-1 ceph-mon[80115]: 6.1 scrub starts
Feb 02 09:40:55 compute-1 ceph-mon[80115]: 6.1 scrub ok
Feb 02 09:40:55 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.c scrub starts
Feb 02 09:40:55 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.c scrub ok
Feb 02 09:40:55 compute-1 bash[85049]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Feb 02 09:40:55 compute-1 bash[85049]: Writing manifest to image destination
Feb 02 09:40:56 compute-1 podman[85049]: 2026-02-02 09:40:56.009607708 +0000 UTC m=+1.058240999 container create 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 09:40:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abf7e2a0b411b65b99726b953b1d4146d3ad8b02be300aae3a359d70a1365d66/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Feb 02 09:40:56 compute-1 podman[85049]: 2026-02-02 09:40:56.057557615 +0000 UTC m=+1.106190926 container init 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 09:40:56 compute-1 podman[85049]: 2026-02-02 09:40:56.061123788 +0000 UTC m=+1.109757069 container start 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 09:40:56 compute-1 bash[85049]: 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07
Feb 02 09:40:56 compute-1 podman[85049]: 2026-02-02 09:40:55.9966255 +0000 UTC m=+1.045258811 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.067Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.067Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.068Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.069Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.069Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.069Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=arp
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=bcache
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=bonding
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=btrfs
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=conntrack
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=cpu
Feb 02 09:40:56 compute-1 systemd[1]: Started Ceph node-exporter.compute-1 for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.070Z caller=node_exporter.go:117 level=info collector=cpufreq
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=diskstats
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=dmi
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=edac
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=entropy
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=fibrechannel
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=filefd
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=filesystem
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=hwmon
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=infiniband
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=ipvs
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=loadavg
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=mdadm
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=meminfo
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=netclass
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=netdev
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=netstat
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=nfs
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=nfsd
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=nvme
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=os
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=pressure
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=rapl
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=schedstat
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=selinux
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=sockstat
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=softnet
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=stat
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=tapestats
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=textfile
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=thermal_zone
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=time
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=udp_queues
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=uname
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=vmstat
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=xfs
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.073Z caller=node_exporter.go:117 level=info collector=zfs
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.075Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Feb 02 09:40:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1[85124]: ts=2026-02-02T09:40:56.075Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Feb 02 09:40:56 compute-1 sudo[84859]: pam_unix(sudo:session): session closed for user root
Feb 02 09:40:56 compute-1 ceph-mon[80115]: 5.6 deep-scrub starts
Feb 02 09:40:56 compute-1 ceph-mon[80115]: 5.6 deep-scrub ok
Feb 02 09:40:56 compute-1 ceph-mon[80115]: 5.f deep-scrub starts
Feb 02 09:40:56 compute-1 ceph-mon[80115]: 5.f deep-scrub ok
Feb 02 09:40:56 compute-1 ceph-mon[80115]: pgmap v11: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Feb 02 09:40:56 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:56 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:56 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:56 compute-1 ceph-mon[80115]: 7.5 scrub starts
Feb 02 09:40:56 compute-1 ceph-mon[80115]: 7.5 scrub ok
Feb 02 09:40:56 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3738810055' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Feb 02 09:40:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:56.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:56.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:56 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.a scrub starts
Feb 02 09:40:56 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.a scrub ok
Feb 02 09:40:57 compute-1 ceph-mon[80115]: 5.c scrub starts
Feb 02 09:40:57 compute-1 ceph-mon[80115]: 5.c scrub ok
Feb 02 09:40:57 compute-1 ceph-mon[80115]: 3.5 scrub starts
Feb 02 09:40:57 compute-1 ceph-mon[80115]: 3.5 scrub ok
Feb 02 09:40:57 compute-1 ceph-mon[80115]: Deploying daemon node-exporter.compute-2 on compute-2
Feb 02 09:40:57 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4097333223' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 09:40:57 compute-1 ceph-mon[80115]: 6.1c scrub starts
Feb 02 09:40:57 compute-1 ceph-mon[80115]: 6.1c scrub ok
Feb 02 09:40:57 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.17 deep-scrub starts
Feb 02 09:40:57 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.17 deep-scrub ok
Feb 02 09:40:58 compute-1 ceph-mon[80115]: 5.a scrub starts
Feb 02 09:40:58 compute-1 ceph-mon[80115]: 5.a scrub ok
Feb 02 09:40:58 compute-1 ceph-mon[80115]: 5.1c deep-scrub starts
Feb 02 09:40:58 compute-1 ceph-mon[80115]: 5.1c deep-scrub ok
Feb 02 09:40:58 compute-1 ceph-mon[80115]: pgmap v12: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Feb 02 09:40:58 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:58 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:58 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:58 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:40:58 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:40:58 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:40:58 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:40:58 compute-1 ceph-mon[80115]: 7.a deep-scrub starts
Feb 02 09:40:58 compute-1 ceph-mon[80115]: 7.a deep-scrub ok
Feb 02 09:40:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:40:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:40:58.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:40:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:40:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:40:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:40:58.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:40:58 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Feb 02 09:40:58 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Feb 02 09:40:59 compute-1 ceph-mon[80115]: 5.17 deep-scrub starts
Feb 02 09:40:59 compute-1 ceph-mon[80115]: 5.17 deep-scrub ok
Feb 02 09:40:59 compute-1 ceph-mon[80115]: 5.7 deep-scrub starts
Feb 02 09:40:59 compute-1 ceph-mon[80115]: 5.7 deep-scrub ok
Feb 02 09:40:59 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2118971521' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Feb 02 09:40:59 compute-1 ceph-mon[80115]: 7.11 scrub starts
Feb 02 09:40:59 compute-1 ceph-mon[80115]: 7.11 scrub ok
Feb 02 09:40:59 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Feb 02 09:40:59 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Feb 02 09:41:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:00 compute-1 ceph-mon[80115]: 2.19 scrub starts
Feb 02 09:41:00 compute-1 ceph-mon[80115]: 2.19 scrub ok
Feb 02 09:41:00 compute-1 ceph-mon[80115]: 5.1 deep-scrub starts
Feb 02 09:41:00 compute-1 ceph-mon[80115]: 5.1 deep-scrub ok
Feb 02 09:41:00 compute-1 ceph-mon[80115]: pgmap v13: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 0 B/s wr, 9 op/s
Feb 02 09:41:00 compute-1 ceph-mon[80115]: 6.17 scrub starts
Feb 02 09:41:00 compute-1 ceph-mon[80115]: 6.17 scrub ok
Feb 02 09:41:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:00.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:00.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:00 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Feb 02 09:41:00 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Feb 02 09:41:01 compute-1 ceph-mon[80115]: 5.1e scrub starts
Feb 02 09:41:01 compute-1 ceph-mon[80115]: 5.1e scrub ok
Feb 02 09:41:01 compute-1 ceph-mon[80115]: 3.3 scrub starts
Feb 02 09:41:01 compute-1 ceph-mon[80115]: 3.3 scrub ok
Feb 02 09:41:01 compute-1 ceph-mon[80115]: from='client.14583 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 02 09:41:01 compute-1 ceph-mon[80115]: 7.16 scrub starts
Feb 02 09:41:01 compute-1 ceph-mon[80115]: 7.16 scrub ok
Feb 02 09:41:01 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Feb 02 09:41:01 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Feb 02 09:41:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:02.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:02 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Feb 02 09:41:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:02.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:02 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Feb 02 09:41:02 compute-1 ceph-mon[80115]: 5.14 scrub starts
Feb 02 09:41:02 compute-1 ceph-mon[80115]: 5.14 scrub ok
Feb 02 09:41:02 compute-1 ceph-mon[80115]: 5.9 deep-scrub starts
Feb 02 09:41:02 compute-1 ceph-mon[80115]: 5.9 deep-scrub ok
Feb 02 09:41:02 compute-1 ceph-mon[80115]: pgmap v14: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 0 B/s wr, 8 op/s
Feb 02 09:41:02 compute-1 ceph-mon[80115]: 7.1d scrub starts
Feb 02 09:41:02 compute-1 ceph-mon[80115]: 7.1d scrub ok
Feb 02 09:41:03 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Feb 02 09:41:03 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 7.1e scrub starts
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 7.1e scrub ok
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 5.2 scrub starts
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 5.2 scrub ok
Feb 02 09:41:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:03 compute-1 ceph-mon[80115]: from='client.14589 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 7.18 scrub starts
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 7.18 scrub ok
Feb 02 09:41:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.vvohrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Feb 02 09:41:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.vvohrf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Feb 02 09:41:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:41:03 compute-1 ceph-mon[80115]: Deploying daemon mds.cephfs.compute-2.vvohrf on compute-2
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 5.10 scrub starts
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 5.10 scrub ok
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 7.14 scrub starts
Feb 02 09:41:03 compute-1 ceph-mon[80115]: 7.14 scrub ok
Feb 02 09:41:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:04.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:04.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:04 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Feb 02 09:41:04 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Feb 02 09:41:05 compute-1 ceph-mon[80115]: pgmap v15: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 0 B/s wr, 7 op/s
Feb 02 09:41:05 compute-1 ceph-mon[80115]: 7.6 scrub starts
Feb 02 09:41:05 compute-1 ceph-mon[80115]: 7.6 scrub ok
Feb 02 09:41:05 compute-1 ceph-mon[80115]: 5.1f scrub starts
Feb 02 09:41:05 compute-1 ceph-mon[80115]: 5.1f scrub ok
Feb 02 09:41:05 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:05 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:05 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:05 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.clmmzw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Feb 02 09:41:05 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.clmmzw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Feb 02 09:41:05 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:41:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e3 new map
Feb 02 09:41:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e3 print_map
                                           e3
                                           btime 2026-02-02T09:41:05.061446+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:40:48.656583+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-2.vvohrf{-1:24310} state up:standby seq 1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
Feb 02 09:41:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e4 new map
Feb 02 09:41:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e4 print_map
                                           e4
                                           btime 2026-02-02T09:41:05.094248+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:41:05.094239+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24310}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-2.vvohrf{0:24310} state up:creating seq 1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Feb 02 09:41:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:05 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Feb 02 09:41:05 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Feb 02 09:41:06 compute-1 ceph-mon[80115]: from='client.14595 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 02 09:41:06 compute-1 ceph-mon[80115]: Deploying daemon mds.cephfs.compute-0.clmmzw on compute-0
Feb 02 09:41:06 compute-1 ceph-mon[80115]: 7.3 scrub starts
Feb 02 09:41:06 compute-1 ceph-mon[80115]: 7.3 scrub ok
Feb 02 09:41:06 compute-1 ceph-mon[80115]: 5.15 scrub starts
Feb 02 09:41:06 compute-1 ceph-mon[80115]: 5.15 scrub ok
Feb 02 09:41:06 compute-1 ceph-mon[80115]: mds.? [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] up:boot
Feb 02 09:41:06 compute-1 ceph-mon[80115]: daemon mds.cephfs.compute-2.vvohrf assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Feb 02 09:41:06 compute-1 ceph-mon[80115]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Feb 02 09:41:06 compute-1 ceph-mon[80115]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Feb 02 09:41:06 compute-1 ceph-mon[80115]: Cluster is now healthy
Feb 02 09:41:06 compute-1 ceph-mon[80115]: fsmap cephfs:0 1 up:standby
Feb 02 09:41:06 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.vvohrf"}]: dispatch
Feb 02 09:41:06 compute-1 ceph-mon[80115]: fsmap cephfs:1 {0=cephfs.compute-2.vvohrf=up:creating}
Feb 02 09:41:06 compute-1 ceph-mon[80115]: daemon mds.cephfs.compute-2.vvohrf is now active in filesystem cephfs as rank 0
Feb 02 09:41:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e5 new map
Feb 02 09:41:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e5 print_map
                                           e5
                                           btime 2026-02-02T09:41:06.101701+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:41:06.101697+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24310}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24310 members: 24310
                                           [mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 2 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Feb 02 09:41:06 compute-1 sudo[85134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:41:06 compute-1 sudo[85134]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:41:06 compute-1 sudo[85134]: pam_unix(sudo:session): session closed for user root
Feb 02 09:41:06 compute-1 sudo[85159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:41:06 compute-1 sudo[85159]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:41:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:41:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:06.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:41:06 compute-1 podman[85224]: 2026-02-02 09:41:06.924940082 +0000 UTC m=+0.034776926 container create 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:41:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:06.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:06 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.e scrub starts
Feb 02 09:41:06 compute-1 systemd[1]: Started libpod-conmon-4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe.scope.
Feb 02 09:41:06 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.e scrub ok
Feb 02 09:41:06 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:41:07 compute-1 podman[85224]: 2026-02-02 09:41:06.90989973 +0000 UTC m=+0.019736594 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:41:07 compute-1 podman[85224]: 2026-02-02 09:41:07.008730991 +0000 UTC m=+0.118567855 container init 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Feb 02 09:41:07 compute-1 podman[85224]: 2026-02-02 09:41:07.013794293 +0000 UTC m=+0.123631137 container start 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:41:07 compute-1 podman[85224]: 2026-02-02 09:41:07.017166231 +0000 UTC m=+0.127003105 container attach 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:41:07 compute-1 elegant_solomon[85241]: 167 167
Feb 02 09:41:07 compute-1 systemd[1]: libpod-4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe.scope: Deactivated successfully.
Feb 02 09:41:07 compute-1 conmon[85241]: conmon 4ed90f2d39cf69139d83 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe.scope/container/memory.events
Feb 02 09:41:07 compute-1 podman[85224]: 2026-02-02 09:41:07.02060858 +0000 UTC m=+0.130445444 container died 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:41:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-cafcddcc12274bd2a400d470b95bc34e4047f4c09913d01a84e09918c0640db6-merged.mount: Deactivated successfully.
Feb 02 09:41:07 compute-1 podman[85224]: 2026-02-02 09:41:07.052358446 +0000 UTC m=+0.162195290 container remove 4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_solomon, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:41:07 compute-1 systemd[1]: libpod-conmon-4ed90f2d39cf69139d83b5c424e85bd2bc014f5ae8d1e77f683408dd73a750fe.scope: Deactivated successfully.
Feb 02 09:41:07 compute-1 systemd[1]: Reloading.
Feb 02 09:41:07 compute-1 ceph-mon[80115]: pgmap v16: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:41:07 compute-1 ceph-mon[80115]: 7.2 scrub starts
Feb 02 09:41:07 compute-1 ceph-mon[80115]: 7.2 scrub ok
Feb 02 09:41:07 compute-1 ceph-mon[80115]: 5.16 scrub starts
Feb 02 09:41:07 compute-1 ceph-mon[80115]: 5.16 scrub ok
Feb 02 09:41:07 compute-1 ceph-mon[80115]: from='client.14601 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 02 09:41:07 compute-1 ceph-mon[80115]: mds.? [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] up:active
Feb 02 09:41:07 compute-1 ceph-mon[80115]: fsmap cephfs:1 {0=cephfs.compute-2.vvohrf=up:active}
Feb 02 09:41:07 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:07 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:07 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:07 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.khfsen", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Feb 02 09:41:07 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.khfsen", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Feb 02 09:41:07 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:41:07 compute-1 ceph-mon[80115]: Deploying daemon mds.cephfs.compute-1.khfsen on compute-1
Feb 02 09:41:07 compute-1 systemd-rc-local-generator[85281]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:41:07 compute-1 systemd-sysv-generator[85285]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:41:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e6 new map
Feb 02 09:41:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e6 print_map
                                           e6
                                           btime 2026-02-02T09:41:07.200268+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:41:06.101697+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24310}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24310 members: 24310
                                           [mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 2 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]
Feb 02 09:41:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e7 new map
Feb 02 09:41:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e7 print_map
                                           e7
                                           btime 2026-02-02T09:41:07.213917+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:41:06.101697+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24310}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24310 members: 24310
                                           [mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 2 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]
Feb 02 09:41:07 compute-1 systemd[1]: Reloading.
Feb 02 09:41:07 compute-1 systemd-sysv-generator[85328]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:41:07 compute-1 systemd-rc-local-generator[85325]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:41:07 compute-1 systemd[1]: Starting Ceph mds.cephfs.compute-1.khfsen for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:41:07 compute-1 podman[85383]: 2026-02-02 09:41:07.841635738 +0000 UTC m=+0.048576015 container create d2de892f7e328a2b8439d80aa4f6b300d90cd45464028edda638ab25282d74d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mds-cephfs-compute-1-khfsen, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:41:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64b9fb5e8fc492fbdc69c58e90ea42bcfd097ef3c941da218c6aec616c8c7bc2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64b9fb5e8fc492fbdc69c58e90ea42bcfd097ef3c941da218c6aec616c8c7bc2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64b9fb5e8fc492fbdc69c58e90ea42bcfd097ef3c941da218c6aec616c8c7bc2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64b9fb5e8fc492fbdc69c58e90ea42bcfd097ef3c941da218c6aec616c8c7bc2/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.khfsen supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:07 compute-1 podman[85383]: 2026-02-02 09:41:07.910683044 +0000 UTC m=+0.117623381 container init d2de892f7e328a2b8439d80aa4f6b300d90cd45464028edda638ab25282d74d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mds-cephfs-compute-1-khfsen, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:41:07 compute-1 podman[85383]: 2026-02-02 09:41:07.818473545 +0000 UTC m=+0.025413892 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:41:07 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Feb 02 09:41:07 compute-1 podman[85383]: 2026-02-02 09:41:07.918671842 +0000 UTC m=+0.125612159 container start d2de892f7e328a2b8439d80aa4f6b300d90cd45464028edda638ab25282d74d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mds-cephfs-compute-1-khfsen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Feb 02 09:41:07 compute-1 bash[85383]: d2de892f7e328a2b8439d80aa4f6b300d90cd45464028edda638ab25282d74d0
Feb 02 09:41:07 compute-1 systemd[1]: Started Ceph mds.cephfs.compute-1.khfsen for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:41:07 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Feb 02 09:41:07 compute-1 ceph-mds[85402]: set uid:gid to 167:167 (ceph:ceph)
Feb 02 09:41:07 compute-1 ceph-mds[85402]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Feb 02 09:41:07 compute-1 ceph-mds[85402]: main not setting numa affinity
Feb 02 09:41:07 compute-1 ceph-mds[85402]: pidfile_write: ignore empty --pid-file
Feb 02 09:41:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mds-cephfs-compute-1-khfsen[85398]: starting mds.cephfs.compute-1.khfsen at 
Feb 02 09:41:07 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Updating MDS map to version 7 from mon.2
Feb 02 09:41:07 compute-1 sudo[85159]: pam_unix(sudo:session): session closed for user root
Feb 02 09:41:08 compute-1 ceph-mon[80115]: 7.e scrub starts
Feb 02 09:41:08 compute-1 ceph-mon[80115]: 7.e scrub ok
Feb 02 09:41:08 compute-1 ceph-mon[80115]: 5.11 scrub starts
Feb 02 09:41:08 compute-1 ceph-mon[80115]: 5.11 scrub ok
Feb 02 09:41:08 compute-1 ceph-mon[80115]: mds.? [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] up:boot
Feb 02 09:41:08 compute-1 ceph-mon[80115]: fsmap cephfs:1 {0=cephfs.compute-2.vvohrf=up:active} 1 up:standby
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.clmmzw"}]: dispatch
Feb 02 09:41:08 compute-1 ceph-mon[80115]: fsmap cephfs:1 {0=cephfs.compute-2.vvohrf=up:active} 1 up:standby
Feb 02 09:41:08 compute-1 ceph-mon[80115]: pgmap v17: 198 pgs: 198 active+clean; 454 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3598454046' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:08 compute-1 ceph-mon[80115]: Creating key for client.nfs.cephfs.0.0.compute-1.mhzhsx
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mhzhsx", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mhzhsx", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Feb 02 09:41:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Feb 02 09:41:08 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e8 new map
Feb 02 09:41:08 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e8 print_map
                                           e8
                                           btime 2026-02-02T09:41:08.229569+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:41:06.101697+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24310}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24310 members: 24310
                                           [mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 2 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.khfsen{-1:24317} state up:standby seq 1 addr [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] compat {c=[1],r=[1],i=[1fff]}]
Feb 02 09:41:08 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Updating MDS map to version 8 from mon.2
Feb 02 09:41:08 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Monitors have assigned me to become a standby
Feb 02 09:41:08 compute-1 sudo[85422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:41:08 compute-1 sudo[85422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:41:08 compute-1 sudo[85422]: pam_unix(sudo:session): session closed for user root
Feb 02 09:41:08 compute-1 sudo[85447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:41:08 compute-1 sudo[85447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:41:08 compute-1 podman[85513]: 2026-02-02 09:41:08.840992434 +0000 UTC m=+0.046761167 container create 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True)
Feb 02 09:41:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:08.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:08 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.f scrub starts
Feb 02 09:41:08 compute-1 systemd[1]: Started libpod-conmon-1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407.scope.
Feb 02 09:41:08 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.f scrub ok
Feb 02 09:41:08 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:41:08 compute-1 podman[85513]: 2026-02-02 09:41:08.822900474 +0000 UTC m=+0.028669237 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:41:08 compute-1 podman[85513]: 2026-02-02 09:41:08.926627742 +0000 UTC m=+0.132396515 container init 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:41:08 compute-1 podman[85513]: 2026-02-02 09:41:08.936767966 +0000 UTC m=+0.142536729 container start 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:41:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:08.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:08 compute-1 podman[85513]: 2026-02-02 09:41:08.940341969 +0000 UTC m=+0.146110702 container attach 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:41:08 compute-1 adoring_nightingale[85530]: 167 167
Feb 02 09:41:08 compute-1 systemd[1]: libpod-1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407.scope: Deactivated successfully.
Feb 02 09:41:08 compute-1 conmon[85530]: conmon 1b5e2b32bee1d3686e46 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407.scope/container/memory.events
Feb 02 09:41:08 compute-1 podman[85513]: 2026-02-02 09:41:08.945682578 +0000 UTC m=+0.151451311 container died 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:41:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-044e511332105de50e14cd72a355e14173d59e72e6b82d9b0bb0fba72e34493a-merged.mount: Deactivated successfully.
Feb 02 09:41:08 compute-1 podman[85513]: 2026-02-02 09:41:08.982937277 +0000 UTC m=+0.188706030 container remove 1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=adoring_nightingale, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:41:08 compute-1 systemd[1]: libpod-conmon-1b5e2b32bee1d3686e4671f84b82bd26dbdae2f93b2379603bc82452073a6407.scope: Deactivated successfully.
Feb 02 09:41:09 compute-1 systemd[1]: Reloading.
Feb 02 09:41:09 compute-1 systemd-sysv-generator[85573]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:41:09 compute-1 systemd-rc-local-generator[85569]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:41:09 compute-1 ceph-mon[80115]: 7.4 scrub starts
Feb 02 09:41:09 compute-1 ceph-mon[80115]: 7.4 scrub ok
Feb 02 09:41:09 compute-1 ceph-mon[80115]: 4.e scrub starts
Feb 02 09:41:09 compute-1 ceph-mon[80115]: 4.e scrub ok
Feb 02 09:41:09 compute-1 ceph-mon[80115]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Feb 02 09:41:09 compute-1 ceph-mon[80115]: mds.? [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] up:boot
Feb 02 09:41:09 compute-1 ceph-mon[80115]: fsmap cephfs:1 {0=cephfs.compute-2.vvohrf=up:active} 2 up:standby
Feb 02 09:41:09 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.khfsen"}]: dispatch
Feb 02 09:41:09 compute-1 ceph-mon[80115]: Rados config object exists: conf-nfs.cephfs
Feb 02 09:41:09 compute-1 ceph-mon[80115]: Creating key for client.nfs.cephfs.0.0.compute-1.mhzhsx-rgw
Feb 02 09:41:09 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mhzhsx-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb 02 09:41:09 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mhzhsx-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
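Annotation: the mgr audit entries above ("auth get-or-create" dispatched and then finished) are ordinary monitor commands, so the same request can be reproduced from any admin client to verify the key. A minimal sketch, assuming the python3-rados bindings and a readable admin keyring on the node; the entity name and caps are copied verbatim from the log:

import json
import rados

# Connect with the local ceph.conf and the default admin keyring.
cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
cluster.connect()

# The same JSON payload the mgr dispatched to the monitors above.
cmd = json.dumps({
    "prefix": "auth get-or-create",
    "entity": "client.nfs.cephfs.0.0.compute-1.mhzhsx-rgw",
    "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"],
})
ret, outbuf, outs = cluster.mon_command(cmd, b"")  # get-or-create is idempotent
print(ret, outbuf.decode())
cluster.shutdown()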
Feb 02 09:41:09 compute-1 ceph-mon[80115]: Bind address in nfs.cephfs.0.0.compute-1.mhzhsx's ganesha conf is defaulting to empty
Feb 02 09:41:09 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:41:09 compute-1 ceph-mon[80115]: Deploying daemon nfs.cephfs.0.0.compute-1.mhzhsx on compute-1
Feb 02 09:41:09 compute-1 systemd[1]: Reloading.
Feb 02 09:41:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e9 new map
Feb 02 09:41:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e9 print_map
                                           e9
                                           btime 2026-02-02T09:41:09.317331+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        9
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:41:09.133293+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24310}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24310 members: 24310
                                           [mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.khfsen{-1:24317} state up:standby seq 1 addr [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] compat {c=[1],r=[1],i=[1fff]}]
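Annotation: the print_map dump above (epoch e9) is the monitor's view of the FSMap; the same information can be pulled on demand instead of scraped from the journal. A rough sketch using the ceph CLI in JSON mode (the JSON field names are assumptions based on recent releases):

import json
import subprocess

# Equivalent of the fsmap printed by ceph-mon above, fetched as JSON.
fsmap = json.loads(subprocess.run(
    ["ceph", "fs", "dump", "--format", "json"],
    check=True, capture_output=True, text=True).stdout)

for fs in fsmap.get("filesystems", []):
    m = fs["mdsmap"]
    print(m["fs_name"], "epoch", m["epoch"], "max_mds", m["max_mds"], "up", m["up"])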
Feb 02 09:41:09 compute-1 systemd-rc-local-generator[85619]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:41:09 compute-1 systemd-sysv-generator[85622]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:41:09 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:41:09 compute-1 podman[85673]: 2026-02-02 09:41:09.776624623 +0000 UTC m=+0.057330372 container create b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Feb 02 09:41:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:09 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Feb 02 09:41:09 compute-1 podman[85673]: 2026-02-02 09:41:09.833074402 +0000 UTC m=+0.113780181 container init b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True)
Feb 02 09:41:09 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Feb 02 09:41:09 compute-1 podman[85673]: 2026-02-02 09:41:09.84222223 +0000 UTC m=+0.122927979 container start b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:41:09 compute-1 podman[85673]: 2026-02-02 09:41:09.750632597 +0000 UTC m=+0.031338416 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:41:09 compute-1 bash[85673]: b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e
Feb 02 09:41:09 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:41:09 compute-1 sudo[85447]: pam_unix(sudo:session): session closed for user root
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:41:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:09 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:41:10 compute-1 ceph-mon[80115]: 7.f scrub starts
Feb 02 09:41:10 compute-1 ceph-mon[80115]: 7.f scrub ok
Feb 02 09:41:10 compute-1 ceph-mon[80115]: 6.15 scrub starts
Feb 02 09:41:10 compute-1 ceph-mon[80115]: 6.15 scrub ok
Feb 02 09:41:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3364289223' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Feb 02 09:41:10 compute-1 ceph-mon[80115]: mds.? [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] up:active
Feb 02 09:41:10 compute-1 ceph-mon[80115]: fsmap cephfs:1 {0=cephfs.compute-2.vvohrf=up:active} 2 up:standby
Feb 02 09:41:10 compute-1 ceph-mon[80115]: pgmap v18: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:41:10 compute-1 ceph-mon[80115]: 7.8 scrub starts
Feb 02 09:41:10 compute-1 ceph-mon[80115]: 7.8 scrub ok
Feb 02 09:41:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:10 compute-1 ceph-mon[80115]: Creating key for client.nfs.cephfs.1.0.compute-2.dciyfa
Feb 02 09:41:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dciyfa", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Feb 02 09:41:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dciyfa", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Feb 02 09:41:10 compute-1 ceph-mon[80115]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Feb 02 09:41:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Feb 02 09:41:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Feb 02 09:41:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:41:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:10.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:10 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.b deep-scrub starts
Feb 02 09:41:10 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.b deep-scrub ok
Feb 02 09:41:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:10.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e10 new map
Feb 02 09:41:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e10 print_map
                                           e10
                                           btime 2026-02-02T09:41:11.335086+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        9
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:41:09.133293+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24310}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24310 members: 24310
                                           [mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.khfsen{-1:24317} state up:standby seq 1 addr [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] compat {c=[1],r=[1],i=[1fff]}]
Feb 02 09:41:11 compute-1 ceph-mon[80115]: 6.a deep-scrub starts
Feb 02 09:41:11 compute-1 ceph-mon[80115]: 6.a deep-scrub ok
Feb 02 09:41:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/789053282' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Feb 02 09:41:11 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Feb 02 09:41:11 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Feb 02 09:41:12 compute-1 ceph-mon[80115]: 7.b deep-scrub starts
Feb 02 09:41:12 compute-1 ceph-mon[80115]: 7.b deep-scrub ok
Feb 02 09:41:12 compute-1 ceph-mon[80115]: 6.7 scrub starts
Feb 02 09:41:12 compute-1 ceph-mon[80115]: 6.7 scrub ok
Feb 02 09:41:12 compute-1 ceph-mon[80115]: mds.? [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] up:standby
Feb 02 09:41:12 compute-1 ceph-mon[80115]: fsmap cephfs:1 {0=cephfs.compute-2.vvohrf=up:active} 2 up:standby
Feb 02 09:41:12 compute-1 ceph-mon[80115]: pgmap v19: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:41:12 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4222980362' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Feb 02 09:41:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e11 new map
Feb 02 09:41:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).mds e11 print_map
                                           e11
                                           btime 2026-02-02T09:41:12.473139+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        9
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-02-02T09:40:48.656583+0000
                                           modified        2026-02-02T09:41:09.133293+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24310}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24310 members: 24310
                                           [mds.cephfs.compute-2.vvohrf{0:24310} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/673721799,v1:192.168.122.102:6805/673721799] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.clmmzw{-1:14607} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/4233871501,v1:192.168.122.100:6807/4233871501] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.khfsen{-1:24317} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] compat {c=[1],r=[1],i=[1fff]}]
Feb 02 09:41:12 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Updating MDS map to version 11 from mon.2
Feb 02 09:41:12 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.10 scrub starts
Feb 02 09:41:12 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.10 scrub ok
Feb 02 09:41:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:12.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:12.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:41:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:13 : epoch 69807135 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
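Annotation: the startup WARN above ("No export entries found in configuration file !!!") is expected at this point, since the ganesha daemon is up but no NFS export has been defined yet. A hedged sketch of adding one through the mgr nfs module; the cluster id and fs name "cephfs" come from this log, while the pseudo-path "/cephfs" and export path "/" are illustrative choices and the flag spellings follow recent releases:

import subprocess

# Create a CephFS export on the "cephfs" NFS cluster (pseudo-path is hypothetical).
subprocess.run(
    ["ceph", "nfs", "export", "create", "cephfs",
     "--cluster-id", "cephfs",
     "--pseudo-path", "/cephfs",
     "--fsname", "cephfs",
     "--path", "/"],
    check=True)

# Confirm the export is registered; ganesha picks it up via its RADOS config objects
# (the "Rados config object exists: conf-nfs.cephfs" messages above).
subprocess.run(["ceph", "nfs", "export", "ls", "cephfs"], check=True)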
Feb 02 09:41:13 compute-1 ceph-mon[80115]: 7.9 scrub starts
Feb 02 09:41:13 compute-1 ceph-mon[80115]: 7.9 scrub ok
Feb 02 09:41:13 compute-1 ceph-mon[80115]: 6.8 scrub starts
Feb 02 09:41:13 compute-1 ceph-mon[80115]: 6.8 scrub ok
Feb 02 09:41:13 compute-1 ceph-mon[80115]: mds.? [v2:192.168.122.101:6804/685771812,v1:192.168.122.101:6805/685771812] up:standby
Feb 02 09:41:13 compute-1 ceph-mon[80115]: fsmap cephfs:1 {0=cephfs.compute-2.vvohrf=up:active} 2 up:standby
Feb 02 09:41:13 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:13 compute-1 ceph-mon[80115]: 7.10 scrub starts
Feb 02 09:41:13 compute-1 ceph-mon[80115]: 7.10 scrub ok
Feb 02 09:41:13 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Feb 02 09:41:13 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Feb 02 09:41:13 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dciyfa-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb 02 09:41:13 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dciyfa-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb 02 09:41:13 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:41:13 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Feb 02 09:41:13 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Feb 02 09:41:14 compute-1 ceph-mon[80115]: 6.5 scrub starts
Feb 02 09:41:14 compute-1 ceph-mon[80115]: 6.5 scrub ok
Feb 02 09:41:14 compute-1 ceph-mon[80115]: Rados config object exists: conf-nfs.cephfs
Feb 02 09:41:14 compute-1 ceph-mon[80115]: Creating key for client.nfs.cephfs.1.0.compute-2.dciyfa-rgw
Feb 02 09:41:14 compute-1 ceph-mon[80115]: Bind address in nfs.cephfs.1.0.compute-2.dciyfa's ganesha conf is defaulting to empty
Feb 02 09:41:14 compute-1 ceph-mon[80115]: Deploying daemon nfs.cephfs.1.0.compute-2.dciyfa on compute-2
Feb 02 09:41:14 compute-1 ceph-mon[80115]: pgmap v20: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:41:14 compute-1 ceph-mon[80115]: 7.13 scrub starts
Feb 02 09:41:14 compute-1 ceph-mon[80115]: 7.13 scrub ok
Feb 02 09:41:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:14.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:41:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:41:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:41:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:41:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:14.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:14 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:41:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:15 compute-1 ceph-mon[80115]: 6.2 scrub starts
Feb 02 09:41:15 compute-1 ceph-mon[80115]: 6.2 scrub ok
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:15 compute-1 ceph-mon[80115]: Creating key for client.nfs.cephfs.2.0.compute-0.fdwwab
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fdwwab", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fdwwab", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Feb 02 09:41:15 compute-1 ceph-mon[80115]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fdwwab-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fdwwab-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Feb 02 09:41:15 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:41:16 compute-1 ceph-mon[80115]: 6.3 scrub starts
Feb 02 09:41:16 compute-1 ceph-mon[80115]: 6.3 scrub ok
Feb 02 09:41:16 compute-1 ceph-mon[80115]: Rados config object exists: conf-nfs.cephfs
Feb 02 09:41:16 compute-1 ceph-mon[80115]: Creating key for client.nfs.cephfs.2.0.compute-0.fdwwab-rgw
Feb 02 09:41:16 compute-1 ceph-mon[80115]: Bind address in nfs.cephfs.2.0.compute-0.fdwwab's ganesha conf is defaulting to empty
Feb 02 09:41:16 compute-1 ceph-mon[80115]: Deploying daemon nfs.cephfs.2.0.compute-0.fdwwab on compute-0
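Annotation: with this third backend ("Deploying daemon nfs.cephfs.2.0.compute-0.fdwwab on compute-0") the orchestrator has placed one ganesha daemon per host. A small sketch to confirm the placement from an admin node; the JSON field names are assumptions based on current cephadm output:

import json
import subprocess

# List NFS daemons known to the orchestrator; ids/hosts should match the
# "Deploying daemon nfs.cephfs.*" messages in this log.
out = subprocess.run(
    ["ceph", "orch", "ps", "--daemon-type", "nfs", "--format", "json"],
    check=True, capture_output=True, text=True).stdout
for d in json.loads(out):
    print(d.get("daemon_id"), d.get("hostname"), d.get("status_desc"))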
Feb 02 09:41:16 compute-1 ceph-mon[80115]: pgmap v21: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 1.8 KiB/s wr, 5 op/s
Feb 02 09:41:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:16.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:16.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:17 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:41:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:17 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:41:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:17 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:41:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:17 : epoch 69807135 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:41:17 compute-1 sudo[85743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:41:17 compute-1 sudo[85743]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:41:17 compute-1 sudo[85743]: pam_unix(sudo:session): session closed for user root
Feb 02 09:41:17 compute-1 sudo[85768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:41:17 compute-1 sudo[85768]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:41:17 compute-1 ceph-mon[80115]: 6.e scrub starts
Feb 02 09:41:17 compute-1 ceph-mon[80115]: 6.e scrub ok
Feb 02 09:41:17 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:17 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:17 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:17 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:17 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:18 compute-1 ceph-mon[80115]: 6.d scrub starts
Feb 02 09:41:18 compute-1 ceph-mon[80115]: 6.d scrub ok
Feb 02 09:41:18 compute-1 ceph-mon[80115]: Deploying daemon haproxy.nfs.cephfs.compute-1.sryqbx on compute-1
Feb 02 09:41:18 compute-1 ceph-mon[80115]: pgmap v22: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 1.8 KiB/s wr, 5 op/s
Feb 02 09:41:18 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:18.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:18.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:19 compute-1 ceph-mon[80115]: 6.1a scrub starts
Feb 02 09:41:19 compute-1 ceph-mon[80115]: 6.1a scrub ok
Feb 02 09:41:19 compute-1 podman[85832]: 2026-02-02 09:41:19.933067386 +0000 UTC m=+2.188622894 container create 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb 02 09:41:19 compute-1 systemd[1]: Started libpod-conmon-002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4.scope.
Feb 02 09:41:19 compute-1 podman[85832]: 2026-02-02 09:41:19.918002274 +0000 UTC m=+2.173557812 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Feb 02 09:41:20 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:41:20 compute-1 podman[85832]: 2026-02-02 09:41:20.01855902 +0000 UTC m=+2.274114548 container init 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb 02 09:41:20 compute-1 podman[85832]: 2026-02-02 09:41:20.026217389 +0000 UTC m=+2.281772897 container start 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb 02 09:41:20 compute-1 podman[85832]: 2026-02-02 09:41:20.029070243 +0000 UTC m=+2.284625751 container attach 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb 02 09:41:20 compute-1 nervous_leakey[85947]: 0 0
Feb 02 09:41:20 compute-1 systemd[1]: libpod-002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4.scope: Deactivated successfully.
Feb 02 09:41:20 compute-1 podman[85832]: 2026-02-02 09:41:20.032441821 +0000 UTC m=+2.287997319 container died 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb 02 09:41:20 compute-1 systemd[1]: var-lib-containers-storage-overlay-6b69922dac310fe717a1652596151c0046bd7ee0432f5a90e39eef0cc5c78c16-merged.mount: Deactivated successfully.
Feb 02 09:41:20 compute-1 podman[85832]: 2026-02-02 09:41:20.064900665 +0000 UTC m=+2.320456183 container remove 002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4 (image=quay.io/ceph/haproxy:2.3, name=nervous_leakey)
Feb 02 09:41:20 compute-1 systemd[1]: libpod-conmon-002192ca2dff51d34c5cf6894964747594bd07b6f1900bdfd46e08a14ede04b4.scope: Deactivated successfully.
Feb 02 09:41:20 compute-1 systemd[1]: Reloading.
Feb 02 09:41:20 compute-1 systemd-sysv-generator[86000]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:41:20 compute-1 systemd-rc-local-generator[85995]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:41:20 compute-1 systemd[1]: Reloading.
Feb 02 09:41:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:20 compute-1 systemd-sysv-generator[86039]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:41:20 compute-1 systemd-rc-local-generator[86032]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:41:20 compute-1 ceph-mon[80115]: pgmap v23: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 6.0 KiB/s rd, 3.6 KiB/s wr, 12 op/s
Feb 02 09:41:20 compute-1 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.sryqbx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:41:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:20.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:20 compute-1 podman[86095]: 2026-02-02 09:41:20.91772211 +0000 UTC m=+0.041358277 container create 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 09:41:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cd7be9a1d6d329ee96cef4a74a73dfb17663a86205b28cdea15cb6da056cb0e/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:20.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:20 compute-1 podman[86095]: 2026-02-02 09:41:20.981698415 +0000 UTC m=+0.105334582 container init 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 09:41:20 compute-1 podman[86095]: 2026-02-02 09:41:20.987191467 +0000 UTC m=+0.110827634 container start 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 09:41:20 compute-1 bash[86095]: 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2
Feb 02 09:41:20 compute-1 podman[86095]: 2026-02-02 09:41:20.901159619 +0000 UTC m=+0.024795776 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Feb 02 09:41:20 compute-1 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.sryqbx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:41:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [NOTICE] 032/094120 (2) : New worker #1 (4) forked
Feb 02 09:41:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[85688]: 02/02/2026 09:41:21 : epoch 69807135 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9dc000df0 fd 37 proxy ignored for local
Feb 02 09:41:21 compute-1 kernel: ganesha.nfsd[86123]: segfault at 50 ip 00007fba5d3d632e sp 00007fb9c67fb210 error 4 in libntirpc.so.5.8[7fba5d3bb000+2c000] likely on CPU 0 (core 0, socket 0)
Feb 02 09:41:21 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:41:21 compute-1 systemd[1]: Created slice Slice /system/systemd-coredump.
Feb 02 09:41:21 compute-1 systemd[1]: Started Process Core Dump (PID 86126/UID 0).
Feb 02 09:41:21 compute-1 sudo[85768]: pam_unix(sudo:session): session closed for user root
Feb 02 09:41:22 compute-1 systemd-coredump[86127]: Process 85692 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 52:
                                                   #0  0x00007fba5d3d632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   #1  0x0000000000000000 n/a (n/a + 0x0)
                                                   #2  0x00007fba5d3e0900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                   ELF object binary architecture: AMD x86-64
Feb 02 09:41:22 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:22 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:22 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:22 compute-1 ceph-mon[80115]: Deploying daemon haproxy.nfs.cephfs.compute-0.ooxkuo on compute-0
Feb 02 09:41:22 compute-1 ceph-mon[80115]: pgmap v24: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.9 KiB/s rd, 2.2 KiB/s wr, 8 op/s
Feb 02 09:41:22 compute-1 systemd[1]: systemd-coredump@0-86126-0.service: Deactivated successfully.
Feb 02 09:41:22 compute-1 podman[86132]: 2026-02-02 09:41:22.15750653 +0000 UTC m=+0.038033820 container died b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:41:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-4f000e337d81c95375c89b4aea8131c7417a0d2d466dbffabd46cfe3e1a43abb-merged.mount: Deactivated successfully.
Feb 02 09:41:22 compute-1 podman[86132]: 2026-02-02 09:41:22.206793042 +0000 UTC m=+0.087320332 container remove b92f7aee261ef9868168c88be932bb880afdd45ef202ba419eb3d1d25ccd8a7e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:41:22 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:41:22 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:41:22 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.148s CPU time.
Feb 02 09:41:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:22.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:22.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:23 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:23 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:23 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:23 compute-1 ceph-mon[80115]: Deploying daemon haproxy.nfs.cephfs.compute-2.arssaq on compute-2
Feb 02 09:41:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:24.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:24 compute-1 ceph-mon[80115]: pgmap v25: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.9 KiB/s rd, 2.2 KiB/s wr, 8 op/s
Feb 02 09:41:24 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:24 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:24 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:24 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:24.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:25 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb 02 09:41:25 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb 02 09:41:25 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Feb 02 09:41:25 compute-1 ceph-mon[80115]: Deploying daemon keepalived.nfs.cephfs.compute-2.tgzfzm on compute-2
Feb 02 09:41:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:26.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:26.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:26 compute-1 ceph-mon[80115]: pgmap v26: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.9 KiB/s rd, 2.2 KiB/s wr, 8 op/s
Feb 02 09:41:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094127 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:41:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:28.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:28.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:29 compute-1 ceph-mon[80115]: pgmap v27: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.0 KiB/s rd, 1.7 KiB/s wr, 6 op/s
Feb 02 09:41:30 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:30 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:30 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:30 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb 02 09:41:30 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Feb 02 09:41:30 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb 02 09:41:30 compute-1 ceph-mon[80115]: Deploying daemon keepalived.nfs.cephfs.compute-0.pqolko on compute-0
Feb 02 09:41:30 compute-1 ceph-mon[80115]: pgmap v28: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.2 KiB/s rd, 1.7 KiB/s wr, 7 op/s
Feb 02 09:41:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:30.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:30.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:32 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 1.
Feb 02 09:41:32 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:41:32 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.148s CPU time.
Feb 02 09:41:32 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:41:32 compute-1 ceph-mon[80115]: pgmap v29: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:41:32 compute-1 podman[86221]: 2026-02-02 09:41:32.79561145 +0000 UTC m=+0.050065983 container create 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:41:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:32 compute-1 podman[86221]: 2026-02-02 09:41:32.852428008 +0000 UTC m=+0.106882511 container init 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:41:32 compute-1 podman[86221]: 2026-02-02 09:41:32.861841483 +0000 UTC m=+0.116296026 container start 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Feb 02 09:41:32 compute-1 bash[86221]: 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8
Feb 02 09:41:32 compute-1 podman[86221]: 2026-02-02 09:41:32.772103219 +0000 UTC m=+0.026557802 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:41:32 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:41:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:41:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:41:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:32.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:41:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:41:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:41:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:41:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:41:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:32.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:32 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:41:33 compute-1 sudo[86278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:41:33 compute-1 sudo[86278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:41:33 compute-1 sudo[86278]: pam_unix(sudo:session): session closed for user root
Feb 02 09:41:33 compute-1 sudo[86303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:41:33 compute-1 sudo[86303]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:41:34 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:34 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:34 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:34 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Feb 02 09:41:34 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb 02 09:41:34 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb 02 09:41:34 compute-1 ceph-mon[80115]: Deploying daemon keepalived.nfs.cephfs.compute-1.whrwoq on compute-1
Feb 02 09:41:34 compute-1 ceph-mon[80115]: pgmap v30: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:41:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:34.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:41:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:34.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:41:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:41:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:36.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:41:36 compute-1 ceph-mon[80115]: pgmap v31: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:41:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:41:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:36.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:41:37 compute-1 podman[86370]: 2026-02-02 09:41:37.32210531 +0000 UTC m=+3.129237753 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Feb 02 09:41:37 compute-1 podman[86370]: 2026-02-02 09:41:37.423232126 +0000 UTC m=+3.230364529 container create 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, release=1793, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, io.openshift.expose-services=, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, build-date=2023-02-22T09:23:20, architecture=x86_64, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc.)
Feb 02 09:41:37 compute-1 systemd[1]: Started libpod-conmon-234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec.scope.
Feb 02 09:41:37 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:41:37 compute-1 podman[86370]: 2026-02-02 09:41:37.525491409 +0000 UTC m=+3.332623862 container init 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, version=2.2.4, architecture=x86_64, io.openshift.tags=Ceph keepalived, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, release=1793, io.buildah.version=1.28.2, distribution-scope=public)
Feb 02 09:41:37 compute-1 podman[86370]: 2026-02-02 09:41:37.534041659 +0000 UTC m=+3.341174062 container start 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, io.buildah.version=1.28.2, name=keepalived, io.openshift.expose-services=, description=keepalived for Ceph, architecture=x86_64, com.redhat.component=keepalived-container, release=1793, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.openshift.tags=Ceph keepalived, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 02 09:41:37 compute-1 nervous_goodall[86467]: 0 0
Feb 02 09:41:37 compute-1 systemd[1]: libpod-234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec.scope: Deactivated successfully.
Feb 02 09:41:37 compute-1 podman[86370]: 2026-02-02 09:41:37.546306811 +0000 UTC m=+3.353439264 container attach 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, io.openshift.expose-services=, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, description=keepalived for Ceph, io.buildah.version=1.28.2, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.component=keepalived-container, name=keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Feb 02 09:41:37 compute-1 podman[86370]: 2026-02-02 09:41:37.547546251 +0000 UTC m=+3.354678644 container died 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.tags=Ceph keepalived, release=1793, version=2.2.4, build-date=2023-02-22T09:23:20, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.openshift.expose-services=)
Feb 02 09:41:37 compute-1 systemd[1]: var-lib-containers-storage-overlay-f0b1f7f70685725d38e8079ceab8f79a91a3e9352fcb7d96bcfa543c847967f2-merged.mount: Deactivated successfully.
Feb 02 09:41:37 compute-1 podman[86370]: 2026-02-02 09:41:37.619782517 +0000 UTC m=+3.426914920 container remove 234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec (image=quay.io/ceph/keepalived:2.2.4, name=nervous_goodall, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, version=2.2.4, build-date=2023-02-22T09:23:20, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2)
Feb 02 09:41:37 compute-1 systemd[1]: libpod-conmon-234e3a955ed9a7946855494b9f4b03cdc58a5280b2fdc913faffc0b3a4ba1dec.scope: Deactivated successfully.
Feb 02 09:41:37 compute-1 systemd[1]: Reloading.
Feb 02 09:41:37 compute-1 systemd-rc-local-generator[86512]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:41:37 compute-1 systemd-sysv-generator[86517]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:41:38 compute-1 systemd[1]: Reloading.
Feb 02 09:41:38 compute-1 systemd-rc-local-generator[86557]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:41:38 compute-1 systemd-sysv-generator[86562]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:41:38 compute-1 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.whrwoq for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:41:38 compute-1 podman[86616]: 2026-02-02 09:41:38.552635525 +0000 UTC m=+0.055493924 container create 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, name=keepalived, vcs-type=git, description=keepalived for Ceph, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, version=2.2.4, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 02 09:41:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/855c0c84ba32e92e80b8de6281d0a005428bc730406eeeb6d1f4ea34daf14c34/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:41:38 compute-1 podman[86616]: 2026-02-02 09:41:38.528327988 +0000 UTC m=+0.031186407 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Feb 02 09:41:38 compute-1 podman[86616]: 2026-02-02 09:41:38.622434611 +0000 UTC m=+0.125293040 container init 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., description=keepalived for Ceph, io.openshift.expose-services=, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, vcs-type=git, version=2.2.4, build-date=2023-02-22T09:23:20, distribution-scope=public, architecture=x86_64, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Feb 02 09:41:38 compute-1 podman[86616]: 2026-02-02 09:41:38.630621912 +0000 UTC m=+0.133480321 container start 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=keepalived, description=keepalived for Ceph, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, vcs-type=git, architecture=x86_64, version=2.2.4, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Feb 02 09:41:38 compute-1 bash[86616]: 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2
Feb 02 09:41:38 compute-1 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.whrwoq for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:41:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Starting Keepalived v2.2.4 (08/21,2021)
Feb 02 09:41:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Running on Linux 5.14.0-665.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026 (built for Linux 5.14.0)
Feb 02 09:41:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Feb 02 09:41:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Configuration file /etc/keepalived/keepalived.conf
Feb 02 09:41:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Feb 02 09:41:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Starting VRRP child process, pid=4
Feb 02 09:41:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: Startup complete
Feb 02 09:41:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: (VI_0) Entering BACKUP STATE (init)
Feb 02 09:41:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:38 2026: VRRP_Script(check_backend) succeeded
Feb 02 09:41:38 compute-1 sudo[86303]: pam_unix(sudo:session): session closed for user root
Feb 02 09:41:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:38.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:38 compute-1 ceph-mon[80115]: pgmap v32: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:41:38 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:38 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:38 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:38 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:41:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:38.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:41:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:39 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:41:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:39 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:41:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:40.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:40.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:41 compute-1 ceph-mon[80115]: Deploying daemon alertmanager.compute-0 on compute-0
Feb 02 09:41:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:42 2026: (VI_0) Entering MASTER STATE
Feb 02 09:41:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:42 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Feb 02 09:41:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq[86631]: Mon Feb  2 09:41:42 2026: (VI_0) Entering BACKUP STATE
Feb 02 09:41:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:42.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:42.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:41:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:44.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:41:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:45 compute-1 ceph-mon[80115]: pgmap v33: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:41:45 compute-1 ceph-mon[80115]: pgmap v34: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:46 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:46 compute-1 ceph-mon[80115]: pgmap v35: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:41:46 compute-1 ceph-mon[80115]: pgmap v36: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:41:46 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:46.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:41:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:46.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:41:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:47 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:48 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:48 compute-1 ceph-mon[80115]: pgmap v37: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:41:48 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:41:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:48 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:48 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Feb 02 09:41:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:41:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:48.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:41:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:48.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094149 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:41:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:49 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:49 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Feb 02 09:41:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:41:49 compute-1 ceph-mon[80115]: osdmap e54: 3 total, 3 up, 3 in
Feb 02 09:41:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:41:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:41:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:49 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:50 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 55 pg[8.0( v 37'12 (0'0,37'12] local-lis/les=36/37 n=6 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=55 pruub=10.417726517s) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 37'11 mlcod 37'11 active pruub 171.950485229s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:41:50 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 55 pg[8.0( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=55 pruub=10.417726517s) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 37'11 mlcod 0'0 unknown pruub 171.950485229s@ mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:50 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:50 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08001f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:50.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:50 compute-1 ceph-mon[80115]: pgmap v39: 198 pgs: 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 511 B/s wr, 2 op/s
Feb 02 09:41:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:41:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:41:50 compute-1 ceph-mon[80115]: osdmap e55: 3 total, 3 up, 3 in
Feb 02 09:41:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:41:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Feb 02 09:41:50 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:51.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:51 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.14( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.15( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.16( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.17( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.10( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.2( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.3( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.f( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.11( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.8( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.9( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.a( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.d( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.c( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.b( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1( v 37'12 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.7( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.6( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.e( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.4( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1b( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.5( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1a( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.19( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.18( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1e( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1d( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1c( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1f( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.13( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.12( v 37'12 lc 0'0 (0'0,37'12] local-lis/les=36/37 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.14( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.16( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.15( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.17( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.10( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.3( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.2( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.11( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.8( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.9( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.a( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.d( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.0( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 37'11 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.7( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.6( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.e( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.4( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.5( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.18( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.19( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1d( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.13( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1e( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.1a( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 56 pg[8.12( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=36/36 les/c/f=37/37/0 sis=55) [0] r=0 lpr=55 pi=[36,55)/1 crt=37'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:52 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.14 scrub starts
Feb 02 09:41:52 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.14 scrub ok
Feb 02 09:41:52 compute-1 ceph-mon[80115]: Regenerating cephadm self-signed grafana TLS certificates
Feb 02 09:41:52 compute-1 ceph-mon[80115]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Feb 02 09:41:52 compute-1 ceph-mon[80115]: Deploying daemon grafana.compute-0 on compute-0
Feb 02 09:41:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:41:52 compute-1 ceph-mon[80115]: osdmap e56: 3 total, 3 up, 3 in
Feb 02 09:41:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:41:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:41:52 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:41:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:52 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Feb 02 09:41:52 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 57 pg[9.0( v 44'1041 (0'0,44'1041] local-lis/les=38/39 n=178 ec=38/38 lis/c=38/38 les/c/f=39/39/0 sis=57 pruub=10.447933197s) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 44'1040 mlcod 44'1040 active pruub 174.220642090s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:41:52 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 57 pg[9.0( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=38/38 lis/c=38/38 les/c/f=39/39/0 sis=57 pruub=10.447933197s) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 44'1040 mlcod 0'0 unknown pruub 174.220642090s@ mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dab2e8 space 0x5616e0cb91f0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc6848 space 0x5616e0db21b0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dabc48 space 0x5616e0bd4de0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc76a8 space 0x5616e0bd4d10 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8c708 space 0x5616e0db2420 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0b89f68 space 0x5616e0cb9120 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8d888 space 0x5616e0db2d10 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0df65c8 space 0x5616e0c6d460 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc7b08 space 0x5616e0db20e0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e077fce8 space 0x5616e0db2aa0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dabb08 space 0x5616e070b7a0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8d4c8 space 0x5616e0db25c0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8cc08 space 0x5616e0db24f0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0daa668 space 0x5616e0c18350 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc6c08 space 0x5616e0db2280 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dab108 space 0x5616e070b870 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9d108 space 0x5616e0db2830 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc7568 space 0x5616e05c9a10 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0daaca8 space 0x5616e0db3c80 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc7e28 space 0x5616e0bd4eb0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0db5608 space 0x5616e0c97e20 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d8cca8 space 0x5616e0bda420 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0b08848 space 0x5616e0cb9050 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0db9b08 space 0x5616e0cb9c80 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9dba8 space 0x5616e0db2c40 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9cb68 space 0x5616e0db2b70 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0dc6028 space 0x5616e0bd5600 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9cac8 space 0x5616e0db2760 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0daa488 space 0x5616e0cb92c0 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0).collection(9.0_head 0x5616e1383200) operator()   moving buffer(0x5616e0d9c5c8 space 0x5616e0db2690 0x0~1000 clean)
Feb 02 09:41:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:52 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:41:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:52.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:41:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:53.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:53 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08001f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:53 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Feb 02 09:41:53 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Feb 02 09:41:53 compute-1 ceph-mon[80115]: pgmap v42: 229 pgs: 31 unknown, 198 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:41:53 compute-1 ceph-mon[80115]: 8.14 scrub starts
Feb 02 09:41:53 compute-1 ceph-mon[80115]: 8.14 scrub ok
Feb 02 09:41:53 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:41:53 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:41:53 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:41:53 compute-1 ceph-mon[80115]: osdmap e57: 3 total, 3 up, 3 in
Feb 02 09:41:53 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Feb 02 09:41:53 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.15( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.14( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.17( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.16( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.11( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.10( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.3( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.2( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.9( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.8( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.c( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.d( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.6( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.7( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.5( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.4( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.18( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.19( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1c( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1d( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.12( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.13( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=38/39 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.14( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.2( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.0( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=38/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 44'1040 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.c( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.5( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.4( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1c( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:53 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 58 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=38/38 les/c/f=39/39/0 sis=57) [0] r=0 lpr=57 pi=[38,57)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:54 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Feb 02 09:41:54 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Feb 02 09:41:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:54 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:54 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Feb 02 09:41:54 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 59 pg[11.0( empty local-lis/les=42/43 n=0 ec=42/42 lis/c=42/42 les/c/f=43/43/0 sis=59 pruub=12.393096924s) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active pruub 178.257980347s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:41:54 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 59 pg[11.0( empty local-lis/les=42/43 n=0 ec=42/42 lis/c=42/42 les/c/f=43/43/0 sis=59 pruub=12.393096924s) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown pruub 178.257980347s@ mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:54 compute-1 ceph-mon[80115]: 8.3 scrub starts
Feb 02 09:41:54 compute-1 ceph-mon[80115]: 8.3 scrub ok
Feb 02 09:41:54 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Feb 02 09:41:54 compute-1 ceph-mon[80115]: osdmap e58: 3 total, 3 up, 3 in
Feb 02 09:41:54 compute-1 ceph-mon[80115]: pgmap v45: 291 pgs: 62 unknown, 229 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:41:54 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:41:54 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Feb 02 09:41:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:54 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:54.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:55.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:55 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:55 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.17 scrub starts
Feb 02 09:41:55 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.17 scrub ok
Feb 02 09:41:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:41:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.17( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.16( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.15( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.13( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.12( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.c( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.b( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.a( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.14( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.9( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.d( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.e( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.f( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.8( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.2( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.3( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.4( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.5( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.6( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.7( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.18( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.19( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1a( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1c( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1b( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1d( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1e( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1f( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.11( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.10( empty local-lis/les=42/43 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.17( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.16( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.12( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.15( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.13( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.c( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.0( empty local-lis/les=59/60 n=0 ec=42/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.b( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.9( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.d( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.8( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.f( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.3( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.4( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.2( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.5( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.6( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.7( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.18( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.19( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.14( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1c( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1b( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1d( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.1f( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.11( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 60 pg[11.10( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=42/42 les/c/f=43/43/0 sis=59) [0] r=0 lpr=59 pi=[42,59)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:41:55 compute-1 ceph-mon[80115]: 8.15 scrub starts
Feb 02 09:41:55 compute-1 ceph-mon[80115]: 8.15 scrub ok
Feb 02 09:41:55 compute-1 ceph-mon[80115]: 10.12 deep-scrub starts
Feb 02 09:41:55 compute-1 ceph-mon[80115]: 10.12 deep-scrub ok
Feb 02 09:41:55 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:41:55 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Feb 02 09:41:55 compute-1 ceph-mon[80115]: osdmap e59: 3 total, 3 up, 3 in
Feb 02 09:41:55 compute-1 ceph-mon[80115]: osdmap e60: 3 total, 3 up, 3 in
Feb 02 09:41:56 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Feb 02 09:41:56 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Feb 02 09:41:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:56 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08008dc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:56 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:56 compute-1 ceph-mon[80115]: 8.17 scrub starts
Feb 02 09:41:56 compute-1 ceph-mon[80115]: 8.17 scrub ok
Feb 02 09:41:56 compute-1 ceph-mon[80115]: 10.7 scrub starts
Feb 02 09:41:56 compute-1 ceph-mon[80115]: 10.7 scrub ok
Feb 02 09:41:56 compute-1 ceph-mon[80115]: pgmap v48: 353 pgs: 62 unknown, 291 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:41:56 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:41:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:41:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:56.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:41:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:41:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:57.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:41:57 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.10 scrub starts
Feb 02 09:41:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:57 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:57 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.10 scrub ok
Feb 02 09:41:57 compute-1 ceph-mon[80115]: 8.2 scrub starts
Feb 02 09:41:57 compute-1 ceph-mon[80115]: 8.2 scrub ok
Feb 02 09:41:57 compute-1 ceph-mon[80115]: 10.1b scrub starts
Feb 02 09:41:57 compute-1 ceph-mon[80115]: 10.1b scrub ok
Feb 02 09:41:57 compute-1 ceph-mon[80115]: 10.1f scrub starts
Feb 02 09:41:57 compute-1 ceph-mon[80115]: 10.1f scrub ok
Feb 02 09:41:58 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.8 deep-scrub starts
Feb 02 09:41:58 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.8 deep-scrub ok
Feb 02 09:41:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:58 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:58 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08008f40 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:41:58.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:41:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:41:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:41:59.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:41:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:41:59 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:41:59 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Feb 02 09:41:59 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Feb 02 09:41:59 compute-1 ceph-mon[80115]: 8.10 scrub starts
Feb 02 09:41:59 compute-1 ceph-mon[80115]: 8.10 scrub ok
Feb 02 09:41:59 compute-1 ceph-mon[80115]: pgmap v49: 353 pgs: 62 unknown, 291 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:41:59 compute-1 ceph-mon[80115]: 10.10 scrub starts
Feb 02 09:41:59 compute-1 ceph-mon[80115]: 10.10 scrub ok
Feb 02 09:42:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.17( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.609076500s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.828979492s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.17( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.609023094s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.828979492s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.14( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.121712685s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.341796875s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.16( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.613359451s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833557129s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.14( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.121651649s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.341796875s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.15( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.125307083s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345596313s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.16( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.613282204s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833557129s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.15( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.125280380s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345596313s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.14( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.614042282s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834625244s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.14( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.614006996s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834625244s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.13( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612858772s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833755493s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.17( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124693871s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345626831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.13( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612816811s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833755493s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.10( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124675751s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345657349s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.17( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124633789s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345626831s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.10( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124631882s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345657349s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.16( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124475479s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345626831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.16( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124404907s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345626831s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.12( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612273216s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833572388s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612373352s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833770752s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.11( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124789238s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346160889s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.12( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612235069s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833572388s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.612346649s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833770752s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.2( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124505997s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345977783s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.11( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124707222s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346160889s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.2( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124474525s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345977783s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.3( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124224663s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.345855713s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124885559s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346649170s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.3( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124187469s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.345855713s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124848366s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346649170s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.9( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124327660s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346328735s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611934662s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.833969116s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611896515s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.833969116s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.8( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124028206s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346130371s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.9( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124276161s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346328735s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.a( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124288559s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346420288s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.8( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123966217s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346130371s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.a( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124264717s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346420288s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.d( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123930931s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346435547s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611637115s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834091187s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.f( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611498833s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834075928s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.d( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123887062s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346435547s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123829842s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346450806s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611479759s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834091187s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.f( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611470222s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834075928s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123803139s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346450806s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.8( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611256599s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834030151s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123665810s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346466064s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123608589s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346466064s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.8( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611164093s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834030151s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.3( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611262321s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834335327s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.4( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611124992s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834335327s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.3( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611133575s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834335327s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.4( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611044884s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834335327s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.5( v 60'1 (0'0,60'1] local-lis/les=59/60 n=1 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611040115s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=60'1 lcod 0'0 mlcod 0'0 active pruub 182.834381104s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.6( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123147011s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346542358s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.5( v 60'1 (0'0,60'1] local-lis/les=59/60 n=1 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.611004829s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=60'1 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 182.834381104s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.6( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123107910s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346542358s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.7( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610848427s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834472656s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.7( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610822678s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834472656s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.4( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122960091s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346664429s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.5( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123128891s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346801758s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.4( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122926712s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346664429s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.5( v 37'12 (0'0,37'12] local-lis/les=55/56 n=1 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123043060s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346801758s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.19( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610724449s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834594727s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122791290s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346664429s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.19( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610702515s) [2] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834594727s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1b( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122735977s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346664429s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610624313s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834655762s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.19( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122812271s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346878052s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1a( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610590935s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834655762s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1b( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610517502s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834686279s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.19( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122742653s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346878052s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.18( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122614861s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.346893311s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1b( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610429764s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834686279s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1c( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610346794s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834655762s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.18( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.122591019s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.346893311s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1c( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610308647s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834655762s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124456406s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.348846436s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1d( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610197067s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834686279s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1f( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124374390s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.348846436s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1d( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610175133s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834686279s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610151291s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active pruub 182.834747314s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[11.1e( empty local-lis/les=59/60 n=0 ec=59/42 lis/c=59/59 les/c/f=60/60/0 sis=61 pruub=11.610117912s) [1] r=-1 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 182.834747314s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124007225s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.348831177s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.1c( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123960495s) [2] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.348831177s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.12( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.124003410s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 active pruub 186.349044800s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[8.12( v 37'12 (0'0,37'12] local-lis/les=55/56 n=0 ec=55/36 lis/c=55/55 les/c/f=56/56/0 sis=61 pruub=15.123972893s) [1] r=-1 lpr=61 pi=[55,61)/1 crt=37'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 186.349044800s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.15( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.10( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.12( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.14( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.13( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.2( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.6( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.a( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.c( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.8( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.b( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.8( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.e( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.5( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.18( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.19( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.1c( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[12.19( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 61 pg[10.1b( empty local-lis/les=0/0 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:00 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Feb 02 09:42:00 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Feb 02 09:42:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:00 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:00 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:00 compute-1 ceph-mon[80115]: 8.8 deep-scrub starts
Feb 02 09:42:00 compute-1 ceph-mon[80115]: 8.8 deep-scrub ok
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:00 compute-1 ceph-mon[80115]: 8.9 scrub starts
Feb 02 09:42:00 compute-1 ceph-mon[80115]: 8.9 scrub ok
Feb 02 09:42:00 compute-1 ceph-mon[80115]: 10.1e scrub starts
Feb 02 09:42:00 compute-1 ceph-mon[80115]: 10.1e scrub ok
Feb 02 09:42:00 compute-1 ceph-mon[80115]: pgmap v50: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 3 op/s
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:00 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb 02 09:42:00 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb 02 09:42:00 compute-1 ceph-mon[80115]: Deploying daemon keepalived.rgw.default.compute-2.tapsuz on compute-2
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Feb 02 09:42:00 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:42:00 compute-1 ceph-mon[80115]: osdmap e61: 3 total, 3 up, 3 in
Feb 02 09:42:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:00.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:01.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:01 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08009860 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:01 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.2( v 41'48 (0'0,41'48] local-lis/les=61/62 n=1 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.10( v 60'66 lc 53'46 (0'0,60'66] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=60'66 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.15( v 60'57 lc 60'56 (0'0,60'57] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=60'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.14( v 60'57 lc 60'56 (0'0,60'57] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=60'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.13( v 41'48 (0'0,41'48] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.12( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.6( v 54'63 lc 53'43 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.b( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.8( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.a( v 54'63 lc 0'0 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.c( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.8( v 41'48 (0'0,41'48] local-lis/les=61/62 n=1 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.e( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.5( v 41'48 (0'0,41'48] local-lis/les=61/62 n=1 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.18( v 41'48 (0'0,41'48] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.19( v 41'48 (0'0,41'48] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.1c( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[12.19( v 54'63 (0'0,54'63] local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [0] r=0 lpr=61 pi=[59,61)/1 crt=54'63 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 62 pg[10.1b( v 41'48 (0'0,41'48] local-lis/les=61/62 n=0 ec=57/40 lis/c=57/57 les/c/f=58/58/0 sis=61) [0] r=0 lpr=61 pi=[57,61)/1 crt=41'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.277121) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321277313, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6845, "num_deletes": 259, "total_data_size": 18219860, "memory_usage": 19128080, "flush_reason": "Manual Compaction"}
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321370080, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11534958, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 6850, "table_properties": {"data_size": 11509030, "index_size": 16342, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8517, "raw_key_size": 83139, "raw_average_key_size": 24, "raw_value_size": 11443688, "raw_average_value_size": 3367, "num_data_blocks": 719, "num_entries": 3398, "num_filter_entries": 3398, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 1770025175, "file_creation_time": 1770025321, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 92992 microseconds, and 26472 cpu microseconds.
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.370174) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11534958 bytes OK
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.370194) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.377756) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.377773) EVENT_LOG_v1 {"time_micros": 1770025321377769, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.377791) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18183110, prev total WAL file size 18183110, number of live WAL files 2.
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.379683) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323535' seq:0, type:0; will stop at (end)
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(1648B)]
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321379738, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11536606, "oldest_snapshot_seqno": -1}
Feb 02 09:42:01 compute-1 anacron[4318]: Job `cron.weekly' started
Feb 02 09:42:01 compute-1 anacron[4318]: Job `cron.weekly' terminated
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3143 keys, 11531437 bytes, temperature: kUnknown
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321546606, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11531437, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11506138, "index_size": 16358, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7877, "raw_key_size": 79606, "raw_average_key_size": 25, "raw_value_size": 11443968, "raw_average_value_size": 3641, "num_data_blocks": 718, "num_entries": 3143, "num_filter_entries": 3143, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025321, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.547189) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11531437 bytes
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.549844) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 69.0 rd, 69.0 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.0, 0.0 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3403, records dropped: 260 output_compression: NoCompression
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.549873) EVENT_LOG_v1 {"time_micros": 1770025321549860, "job": 4, "event": "compaction_finished", "compaction_time_micros": 167196, "compaction_time_cpu_micros": 16046, "output_level": 6, "num_output_files": 1, "total_output_size": 11531437, "num_input_records": 3403, "num_output_records": 3143, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321551445, "job": 4, "event": "table_file_deletion", "file_number": 14}
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025321551499, "job": 4, "event": "table_file_deletion", "file_number": 8}
Feb 02 09:42:01 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:01.379581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:42:01 compute-1 ceph-mon[80115]: 9.15 scrub starts
Feb 02 09:42:01 compute-1 ceph-mon[80115]: 9.15 scrub ok
Feb 02 09:42:01 compute-1 ceph-mon[80115]: 10.17 scrub starts
Feb 02 09:42:01 compute-1 ceph-mon[80115]: 10.17 scrub ok
Feb 02 09:42:01 compute-1 ceph-mon[80115]: osdmap e62: 3 total, 3 up, 3 in
Feb 02 09:42:01 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Feb 02 09:42:02 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.10 scrub starts
Feb 02 09:42:02 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.10 scrub ok
Feb 02 09:42:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Feb 02 09:42:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:02 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:02 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:02 compute-1 ceph-mon[80115]: pgmap v53: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 3 op/s
Feb 02 09:42:02 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:02 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:02 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:02 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:02 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Feb 02 09:42:02 compute-1 ceph-mon[80115]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Feb 02 09:42:02 compute-1 ceph-mon[80115]: Deploying daemon keepalived.rgw.default.compute-0.pxmjnp on compute-0
Feb 02 09:42:02 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Feb 02 09:42:02 compute-1 ceph-mon[80115]: osdmap e63: 3 total, 3 up, 3 in
Feb 02 09:42:02 compute-1 ceph-mon[80115]: 8.f scrub starts
Feb 02 09:42:02 compute-1 ceph-mon[80115]: 8.f scrub ok
Feb 02 09:42:02 compute-1 ceph-mon[80115]: 8.19 scrub starts
Feb 02 09:42:02 compute-1 ceph-mon[80115]: 8.19 scrub ok
Feb 02 09:42:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:02.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:03.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:03 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:03 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Feb 02 09:42:03 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Feb 02 09:42:03 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.606101990s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.751937866s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.605978966s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.751983643s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.605916023s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.751983643s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.605821609s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.751937866s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609919548s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756454468s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609889030s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756454468s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609468460s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756454468s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609417915s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756454468s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609522820s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756866455s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609489441s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756866455s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.609068871s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.757019043s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.608972549s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.757019043s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.608766556s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756958008s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.608687401s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756958008s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.614995003s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.763641357s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:03 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 64 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=64 pruub=13.614953041s) [2] r=-1 lpr=64 pi=[57,64)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.763641357s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:03 compute-1 ceph-mon[80115]: 12.10 scrub starts
Feb 02 09:42:03 compute-1 ceph-mon[80115]: 12.10 scrub ok
Feb 02 09:42:03 compute-1 ceph-mon[80115]: 12.7 scrub starts
Feb 02 09:42:03 compute-1 ceph-mon[80115]: 12.7 scrub ok
Feb 02 09:42:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Feb 02 09:42:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:03 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:04 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.6 scrub starts
Feb 02 09:42:04 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.6 scrub ok
Feb 02 09:42:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:04 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e08009860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:04 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9df0002f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:04.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:04 compute-1 ceph-mon[80115]: 10.14 scrub starts
Feb 02 09:42:04 compute-1 ceph-mon[80115]: 10.14 scrub ok
Feb 02 09:42:04 compute-1 ceph-mon[80115]: pgmap v55: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 3 op/s; 2 B/s, 0 objects/s recovering
Feb 02 09:42:04 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Feb 02 09:42:04 compute-1 ceph-mon[80115]: osdmap e64: 3 total, 3 up, 3 in
Feb 02 09:42:04 compute-1 ceph-mon[80115]: 12.6 scrub starts
Feb 02 09:42:04 compute-1 ceph-mon[80115]: 12.6 scrub ok
Feb 02 09:42:04 compute-1 ceph-mon[80115]: 8.16 scrub starts
Feb 02 09:42:04 compute-1 ceph-mon[80115]: 8.16 scrub ok
Feb 02 09:42:04 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.15 scrub starts
Feb 02 09:42:04 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Feb 02 09:42:05 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.15 scrub ok
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:05 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 65 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:05.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:05 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:05 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.a deep-scrub starts
Feb 02 09:42:05 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.a deep-scrub ok
Feb 02 09:42:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Feb 02 09:42:06 compute-1 ceph-mon[80115]: Deploying daemon prometheus.compute-0 on compute-0
Feb 02 09:42:06 compute-1 ceph-mon[80115]: 10.15 scrub starts
Feb 02 09:42:06 compute-1 ceph-mon[80115]: osdmap e65: 3 total, 3 up, 3 in
Feb 02 09:42:06 compute-1 ceph-mon[80115]: 10.15 scrub ok
Feb 02 09:42:06 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 66 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=65) [2]/[0] async=[2] r=0 lpr=65 pi=[57,65)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.793918610s) [2] async=[2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273803711s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.3( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.793812752s) [2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273803711s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.782711983s) [2] async=[2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.263900757s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.7( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.782621384s) [2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.263900757s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.782208443s) [2] async=[2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.263931274s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:06 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 67 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=67 pruub=15.782091141s) [2] r=-1 lpr=67 pi=[57,67)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.263931274s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:06 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:06 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e0800a180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:06 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.15 scrub starts
Feb 02 09:42:06 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.15 scrub ok
Feb 02 09:42:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:06.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:07.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:07 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e0800a180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:07 compute-1 ceph-mon[80115]: pgmap v58: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 110 B/s, 0 keys/s, 2 objects/s recovering
Feb 02 09:42:07 compute-1 ceph-mon[80115]: 12.a deep-scrub starts
Feb 02 09:42:07 compute-1 ceph-mon[80115]: 12.a deep-scrub ok
Feb 02 09:42:07 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Feb 02 09:42:07 compute-1 ceph-mon[80115]: osdmap e66: 3 total, 3 up, 3 in
Feb 02 09:42:07 compute-1 ceph-mon[80115]: osdmap e67: 3 total, 3 up, 3 in
Feb 02 09:42:07 compute-1 ceph-mon[80115]: 10.16 scrub starts
Feb 02 09:42:07 compute-1 ceph-mon[80115]: 10.16 scrub ok
Feb 02 09:42:07 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.790109634s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273849487s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.17( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.790036201s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273849487s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.789252281s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273773193s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.789169312s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273773193s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.788814545s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273757935s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.f( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=6 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.788699150s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273757935s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.788599014s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.274124146s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.788511276s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.274124146s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.787535667s) [2] async=[2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 193.273727417s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:07 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 68 pg[9.13( v 44'1041 (0'0,44'1041] local-lis/les=65/66 n=5 ec=57/38 lis/c=65/57 les/c/f=66/58/0 sis=68 pruub=14.787388802s) [2] r=-1 lpr=68 pi=[57,68)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.273727417s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:07 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.0 scrub starts
Feb 02 09:42:07 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.0 scrub ok
Feb 02 09:42:08 compute-1 ceph-mon[80115]: 11.15 scrub starts
Feb 02 09:42:08 compute-1 ceph-mon[80115]: 11.15 scrub ok
Feb 02 09:42:08 compute-1 ceph-mon[80115]: osdmap e68: 3 total, 3 up, 3 in
Feb 02 09:42:08 compute-1 ceph-mon[80115]: 12.11 deep-scrub starts
Feb 02 09:42:08 compute-1 ceph-mon[80115]: 12.11 deep-scrub ok
Feb 02 09:42:08 compute-1 ceph-mon[80115]: 12.15 scrub starts
Feb 02 09:42:08 compute-1 ceph-mon[80115]: 12.15 scrub ok
Feb 02 09:42:08 compute-1 ceph-mon[80115]: pgmap v62: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:08 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Feb 02 09:42:08 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Feb 02 09:42:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:08 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de8000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:08 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9de4003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.721615791s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.751953125s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.721550941s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.751953125s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.725348473s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.756683350s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.725308418s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.756683350s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.724955559s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=58'1042 lcod 59'1043 mlcod 59'1043 active pruub 188.756774902s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.724844933s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=58'1042 lcod 59'1043 mlcod 0'0 unknown NOTIFY pruub 188.756774902s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.731335640s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 188.763656616s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:08 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 69 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=69 pruub=8.731319427s) [2] r=-1 lpr=69 pi=[57,69)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 188.763656616s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:08 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.c scrub starts
Feb 02 09:42:08 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.c scrub ok
Feb 02 09:42:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:42:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:08.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:42:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:09.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:09 compute-1 kernel: ganesha.nfsd[86643]: segfault at 50 ip 00007f9e9266932e sp 00007f9e1e7fb210 error 4 in libntirpc.so.5.8[7f9e9264e000+2c000] likely on CPU 2 (core 0, socket 2)
Feb 02 09:42:09 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:42:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[86236]: 02/02/2026 09:42:09 : epoch 6980714c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e00001c00 fd 48 proxy ignored for local
Feb 02 09:42:09 compute-1 systemd[1]: Started Process Core Dump (PID 86673/UID 0).
Feb 02 09:42:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Feb 02 09:42:09 compute-1 ceph-mon[80115]: 11.0 scrub starts
Feb 02 09:42:09 compute-1 ceph-mon[80115]: 11.0 scrub ok
Feb 02 09:42:09 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Feb 02 09:42:09 compute-1 ceph-mon[80115]: osdmap e69: 3 total, 3 up, 3 in
Feb 02 09:42:09 compute-1 ceph-mon[80115]: 12.3 scrub starts
Feb 02 09:42:09 compute-1 ceph-mon[80115]: 12.3 scrub ok
Feb 02 09:42:09 compute-1 ceph-mon[80115]: 10.0 scrub starts
Feb 02 09:42:09 compute-1 ceph-mon[80115]: 10.0 scrub ok
Feb 02 09:42:09 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=58'1042 lcod 59'1043 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:09 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:09 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=58'1042 lcod 59'1043 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:09 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:09 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:09 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:09 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:09 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 70 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:09 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.b scrub starts
Feb 02 09:42:09 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.b scrub ok
Feb 02 09:42:09 compute-1 systemd-coredump[86674]: Process 86240 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 41:
                                                   #0  0x00007f9e9266932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Feb 02 09:42:09 compute-1 systemd[1]: systemd-coredump@1-86673-0.service: Deactivated successfully.
Feb 02 09:42:10 compute-1 podman[86679]: 2026-02-02 09:42:10.020227001 +0000 UTC m=+0.024935834 container died 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Feb 02 09:42:10 compute-1 systemd[1]: var-lib-containers-storage-overlay-fffc5fa1fa69a3fdffe465386ca294ac1ff31a0b62c5a54f0b0563f32f221eaf-merged.mount: Deactivated successfully.
Feb 02 09:42:10 compute-1 podman[86679]: 2026-02-02 09:42:10.201033035 +0000 UTC m=+0.205741808 container remove 7c9865c3c5fcb01fb9a72090f5b9a87596fc8b607b4daae4dd97ee2f68e491d8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Feb 02 09:42:10 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:42:10 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:42:10 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.105s CPU time.
Feb 02 09:42:10 compute-1 ceph-mon[80115]: 11.c scrub starts
Feb 02 09:42:10 compute-1 ceph-mon[80115]: 11.c scrub ok
Feb 02 09:42:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:10 compute-1 ceph-mon[80115]: osdmap e70: 3 total, 3 up, 3 in
Feb 02 09:42:10 compute-1 ceph-mon[80115]: 11.19 scrub starts
Feb 02 09:42:10 compute-1 ceph-mon[80115]: 11.19 scrub ok
Feb 02 09:42:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Feb 02 09:42:10 compute-1 ceph-mon[80115]: 10.e scrub starts
Feb 02 09:42:10 compute-1 ceph-mon[80115]: 10.e scrub ok
Feb 02 09:42:10 compute-1 ceph-mon[80115]: pgmap v65: 353 pgs: 353 active+clean; 457 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 213 B/s, 9 objects/s recovering
Feb 02 09:42:10 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Feb 02 09:42:10 compute-1 ceph-mon[80115]: 11.b scrub starts
Feb 02 09:42:10 compute-1 ceph-mon[80115]: 11.b scrub ok
Feb 02 09:42:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:10.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:11.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:11 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.9 scrub starts
Feb 02 09:42:11 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.9 scrub ok
Feb 02 09:42:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.397579193s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 196.752578735s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.397531509s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 196.752578735s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.400834084s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 196.756729126s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.400775909s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 196.756729126s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.399782181s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 196.756805420s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.399748802s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 196.756805420s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.400461197s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 196.757797241s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=71 pruub=14.400429726s) [1] r=-1 lpr=71 pi=[57,71)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 196.757797241s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] async=[2] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] async=[2] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.5( v 59'1044 (0'0,59'1044] local-lis/les=70/71 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] async=[2] r=0 lpr=70 pi=[57,70)/1 crt=59'1044 lcod 59'1043 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 71 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] async=[2] r=0 lpr=70 pi=[57,70)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:11 compute-1 ceph-mgr[80422]: mgr handle_mgr_map respawning because set of enabled modules changed!
Feb 02 09:42:11 compute-1 sshd-session[83564]: Connection closed by 192.168.122.100 port 48194
Feb 02 09:42:11 compute-1 sshd-session[83533]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 02 09:42:11 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Feb 02 09:42:11 compute-1 systemd[1]: session-34.scope: Consumed 18.173s CPU time.
Feb 02 09:42:11 compute-1 systemd-logind[805]: Session 34 logged out. Waiting for processes to exit.
Feb 02 09:42:11 compute-1 systemd-logind[805]: Removed session 34.
Feb 02 09:42:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setuser ceph since I am not root
Feb 02 09:42:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: ignoring --setgroup ceph since I am not root
Feb 02 09:42:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=5 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=72 pruub=15.885287285s) [2] async=[2] r=-1 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 198.365600586s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.1d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=5 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=72 pruub=15.885204315s) [2] r=-1 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 198.365600586s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:11 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 72 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:11 compute-1 ceph-mgr[80422]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Feb 02 09:42:11 compute-1 ceph-mgr[80422]: pidfile_write: ignore empty --pid-file
Feb 02 09:42:11 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'alerts'
Feb 02 09:42:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:11.415+0000 7f8e1069c140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:42:11 compute-1 ceph-mgr[80422]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 02 09:42:11 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'balancer'
Feb 02 09:42:11 compute-1 ceph-mon[80115]: 11.16 scrub starts
Feb 02 09:42:11 compute-1 ceph-mon[80115]: 11.16 scrub ok
Feb 02 09:42:11 compute-1 ceph-mon[80115]: 10.c scrub starts
Feb 02 09:42:11 compute-1 ceph-mon[80115]: 10.c scrub ok
Feb 02 09:42:11 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Feb 02 09:42:11 compute-1 ceph-mon[80115]: osdmap e71: 3 total, 3 up, 3 in
Feb 02 09:42:11 compute-1 ceph-mon[80115]: from='mgr.14514 192.168.122.100:0/3917148228' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Feb 02 09:42:11 compute-1 ceph-mon[80115]: mgrmap e25: compute-0.djvyfo(active, since 83s), standbys: compute-1.teascl, compute-2.gzlyac
Feb 02 09:42:11 compute-1 ceph-mon[80115]: osdmap e72: 3 total, 3 up, 3 in
Feb 02 09:42:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:11.493+0000 7f8e1069c140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:42:11 compute-1 ceph-mgr[80422]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 02 09:42:11 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'cephadm'
Feb 02 09:42:12 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.d scrub starts
Feb 02 09:42:12 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.d scrub ok
Feb 02 09:42:12 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'crash'
Feb 02 09:42:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:12.252+0000 7f8e1069c140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:42:12 compute-1 ceph-mgr[80422]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 02 09:42:12 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'dashboard'
Feb 02 09:42:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=4 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.876825333s) [2] async=[2] r=-1 lpr=73 pi=[57,73)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 198.372344971s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.15( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=4 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.876741409s) [2] r=-1 lpr=73 pi=[57,73)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 198.372344971s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=6 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.875668526s) [2] async=[2] r=-1 lpr=73 pi=[57,73)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 198.372756958s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.d( v 44'1041 (0'0,44'1041] local-lis/les=70/71 n=6 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.875595093s) [2] r=-1 lpr=73 pi=[57,73)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 198.372756958s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.5( v 72'1048 (0'0,72'1048] local-lis/les=70/71 n=6 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.874918938s) [2] async=[2] r=-1 lpr=73 pi=[57,73)/1 crt=59'1044 lcod 72'1047 mlcod 72'1047 active pruub 198.372436523s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.5( v 72'1048 (0'0,72'1048] local-lis/les=70/71 n=6 ec=57/38 lis/c=70/57 les/c/f=71/58/0 sis=73 pruub=14.874726295s) [2] r=-1 lpr=73 pi=[57,73)/1 crt=59'1044 lcod 72'1047 mlcod 0'0 unknown NOTIFY pruub 198.372436523s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] async=[1] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] async=[1] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] async=[1] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:12 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 73 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=72) [1]/[0] async=[1] r=0 lpr=72 pi=[57,72)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=6}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:12 compute-1 ceph-mon[80115]: 11.9 scrub starts
Feb 02 09:42:12 compute-1 ceph-mon[80115]: 11.9 scrub ok
Feb 02 09:42:12 compute-1 ceph-mon[80115]: 10.a scrub starts
Feb 02 09:42:12 compute-1 ceph-mon[80115]: 10.a scrub ok
Feb 02 09:42:12 compute-1 ceph-mon[80115]: osdmap e73: 3 total, 3 up, 3 in
Feb 02 09:42:12 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'devicehealth'
Feb 02 09:42:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:12.900+0000 7f8e1069c140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:42:12 compute-1 ceph-mgr[80422]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 02 09:42:12 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'diskprediction_local'
Feb 02 09:42:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:12.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 02 09:42:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 02 09:42:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]:   from numpy import show_config as show_numpy_config
Feb 02 09:42:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:13.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:13.042+0000 7f8e1069c140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'influx'
Feb 02 09:42:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:13.105+0000 7f8e1069c140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'insights'
Feb 02 09:42:13 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.e scrub starts
Feb 02 09:42:13 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.e scrub ok
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'iostat'
Feb 02 09:42:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:13.244+0000 7f8e1069c140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'k8sevents'
Feb 02 09:42:13 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Feb 02 09:42:13 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.008621216s) [1] async=[1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 199.507293701s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:13 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.16( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.008448601s) [1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 199.507293701s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:13 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.033875465s) [1] async=[1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 199.532867432s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:13 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.033797264s) [1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 199.532867432s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:13 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.032817841s) [1] async=[1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 199.532943726s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:13 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.6( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=6 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.032670975s) [1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 199.532943726s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:13 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.032168388s) [1] async=[1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 199.532821655s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:13 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 74 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=72/73 n=5 ec=57/38 lis/c=72/57 les/c/f=73/58/0 sis=74 pruub=15.032093048s) [1] r=-1 lpr=74 pi=[57,74)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 199.532821655s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:13 compute-1 ceph-mon[80115]: 11.d scrub starts
Feb 02 09:42:13 compute-1 ceph-mon[80115]: 11.d scrub ok
Feb 02 09:42:13 compute-1 ceph-mon[80115]: 9.1d deep-scrub starts
Feb 02 09:42:13 compute-1 ceph-mon[80115]: 9.1d deep-scrub ok
Feb 02 09:42:13 compute-1 ceph-mon[80115]: 10.9 scrub starts
Feb 02 09:42:13 compute-1 ceph-mon[80115]: 10.9 scrub ok
Feb 02 09:42:13 compute-1 ceph-mon[80115]: osdmap e74: 3 total, 3 up, 3 in
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'localpool'
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mds_autoscaler'
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'mirroring'
Feb 02 09:42:13 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'nfs'
Feb 02 09:42:14 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.2 scrub starts
Feb 02 09:42:14 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.2 scrub ok
Feb 02 09:42:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.168+0000 7f8e1069c140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'orchestrator'
Feb 02 09:42:14 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Feb 02 09:42:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.382+0000 7f8e1069c140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_perf_query'
Feb 02 09:42:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.449+0000 7f8e1069c140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'osd_support'
Feb 02 09:42:14 compute-1 ceph-mon[80115]: 8.e scrub starts
Feb 02 09:42:14 compute-1 ceph-mon[80115]: 8.e scrub ok
Feb 02 09:42:14 compute-1 ceph-mon[80115]: 9.d scrub starts
Feb 02 09:42:14 compute-1 ceph-mon[80115]: 9.d scrub ok
Feb 02 09:42:14 compute-1 ceph-mon[80115]: 12.f scrub starts
Feb 02 09:42:14 compute-1 ceph-mon[80115]: 12.f scrub ok
Feb 02 09:42:14 compute-1 ceph-mon[80115]: osdmap e75: 3 total, 3 up, 3 in
Feb 02 09:42:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.506+0000 7f8e1069c140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'pg_autoscaler'
Feb 02 09:42:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.577+0000 7f8e1069c140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'progress'
Feb 02 09:42:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.643+0000 7f8e1069c140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'prometheus'
Feb 02 09:42:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:14.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:14.952+0000 7f8e1069c140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 02 09:42:14 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rbd_support'
Feb 02 09:42:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:15.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:15.043+0000 7f8e1069c140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:42:15 compute-1 ceph-mgr[80422]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 02 09:42:15 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'restful'
Feb 02 09:42:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094215 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:42:15 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1 scrub starts
Feb 02 09:42:15 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1 scrub ok
Feb 02 09:42:15 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rgw'
Feb 02 09:42:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:15.425+0000 7f8e1069c140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:42:15 compute-1 ceph-mgr[80422]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 02 09:42:15 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'rook'
Feb 02 09:42:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:15 compute-1 ceph-mon[80115]: 11.2 scrub starts
Feb 02 09:42:15 compute-1 ceph-mon[80115]: 11.2 scrub ok
Feb 02 09:42:15 compute-1 ceph-mon[80115]: 9.5 scrub starts
Feb 02 09:42:15 compute-1 ceph-mon[80115]: 9.5 scrub ok
Feb 02 09:42:15 compute-1 ceph-mon[80115]: 10.d scrub starts
Feb 02 09:42:15 compute-1 ceph-mon[80115]: 10.d scrub ok
Feb 02 09:42:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:15.912+0000 7f8e1069c140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:42:15 compute-1 ceph-mgr[80422]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 02 09:42:15 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'selftest'
Feb 02 09:42:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:15.976+0000 7f8e1069c140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:42:15 compute-1 ceph-mgr[80422]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 02 09:42:15 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'snap_schedule'
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.047+0000 7f8e1069c140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'stats'
Feb 02 09:42:16 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.0 scrub starts
Feb 02 09:42:16 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.0 scrub ok
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'status'
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.175+0000 7f8e1069c140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telegraf'
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.236+0000 7f8e1069c140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'telemetry'
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.369+0000 7f8e1069c140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'test_orchestrator'
Feb 02 09:42:16 compute-1 ceph-mon[80115]: 8.1 scrub starts
Feb 02 09:42:16 compute-1 ceph-mon[80115]: 8.1 scrub ok
Feb 02 09:42:16 compute-1 ceph-mon[80115]: 10.b scrub starts
Feb 02 09:42:16 compute-1 ceph-mon[80115]: 10.b scrub ok
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.561+0000 7f8e1069c140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'volumes'
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.786+0000 7f8e1069c140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Loading python module 'zabbix'
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 2026-02-02T09:42:16.845+0000 7f8e1069c140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr load Constructed class from module: dashboard
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: mgr load Constructed class from module: prometheus
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: ms_deliver_dispatch: unhandled message 0x55a32753d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [dashboard INFO root] Configured CherryPy, starting engine...
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [dashboard INFO root] Starting engine...
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [prometheus INFO root] server_addr: :: server_port: 9283
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [prometheus INFO root] Starting engine...
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: [02/Feb/2026:09:42:16] ENGINE Bus STARTING
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [prometheus INFO cherrypy.error] [02/Feb/2026:09:42:16] ENGINE Bus STARTING
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: CherryPy Checker:
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: The Application mounted at '' has an empty config.
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: 
Feb 02 09:42:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:16.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [dashboard INFO root] Engine started...
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: [02/Feb/2026:09:42:16] ENGINE Serving on http://:::9283
Feb 02 09:42:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-mgr-compute-1-teascl[80418]: [02/Feb/2026:09:42:16] ENGINE Bus STARTED
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [prometheus INFO cherrypy.error] [02/Feb/2026:09:42:16] ENGINE Serving on http://:::9283
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [prometheus INFO cherrypy.error] [02/Feb/2026:09:42:16] ENGINE Bus STARTED
Feb 02 09:42:16 compute-1 ceph-mgr[80422]: [prometheus INFO root] Engine started.
Feb 02 09:42:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Feb 02 09:42:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:17.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:17 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.7 scrub starts
Feb 02 09:42:17 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.7 scrub ok
Feb 02 09:42:17 compute-1 sshd-session[86781]: Accepted publickey for ceph-admin from 192.168.122.100 port 43538 ssh2: RSA SHA256:U0yYyMay/+pOHGkTC+bWOMAOMtoKtn/A+YnW2fdFMFU
Feb 02 09:42:17 compute-1 ceph-mon[80115]: 8.0 scrub starts
Feb 02 09:42:17 compute-1 ceph-mon[80115]: 8.0 scrub ok
Feb 02 09:42:17 compute-1 ceph-mon[80115]: 12.1e scrub starts
Feb 02 09:42:17 compute-1 ceph-mon[80115]: 12.1e scrub ok
Feb 02 09:42:17 compute-1 ceph-mon[80115]: 12.d scrub starts
Feb 02 09:42:17 compute-1 ceph-mon[80115]: 12.d scrub ok
Feb 02 09:42:17 compute-1 ceph-mon[80115]: Standby manager daemon compute-1.teascl restarted
Feb 02 09:42:17 compute-1 ceph-mon[80115]: Standby manager daemon compute-1.teascl started
Feb 02 09:42:17 compute-1 ceph-mon[80115]: Standby manager daemon compute-2.gzlyac restarted
Feb 02 09:42:17 compute-1 ceph-mon[80115]: Standby manager daemon compute-2.gzlyac started
Feb 02 09:42:17 compute-1 ceph-mon[80115]: Active manager daemon compute-0.djvyfo restarted
Feb 02 09:42:17 compute-1 ceph-mon[80115]: Activating manager daemon compute-0.djvyfo
Feb 02 09:42:17 compute-1 ceph-mon[80115]: osdmap e76: 3 total, 3 up, 3 in
Feb 02 09:42:17 compute-1 ceph-mon[80115]: mgrmap e26: compute-0.djvyfo(active, starting, since 0.0413396s), standbys: compute-2.gzlyac, compute-1.teascl
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.clmmzw"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.vvohrf"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.khfsen"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-0.djvyfo", "id": "compute-0.djvyfo"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-2.gzlyac", "id": "compute-2.gzlyac"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr metadata", "who": "compute-1.teascl", "id": "compute-1.teascl"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mds metadata"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mon metadata"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: Manager daemon compute-0.djvyfo is now available
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/mirror_snapshot_schedule"}]: dispatch
Feb 02 09:42:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.djvyfo/trash_purge_schedule"}]: dispatch
Feb 02 09:42:17 compute-1 systemd-logind[805]: New session 36 of user ceph-admin.
Feb 02 09:42:17 compute-1 systemd[1]: Started Session 36 of User ceph-admin.
Feb 02 09:42:17 compute-1 sshd-session[86781]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Feb 02 09:42:17 compute-1 sudo[86785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:42:17 compute-1 sudo[86785]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:17 compute-1 sudo[86785]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:17 compute-1 sudo[86810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Feb 02 09:42:17 compute-1 sudo[86810]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:18 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.6 scrub starts
Feb 02 09:42:18 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.6 scrub ok
Feb 02 09:42:18 compute-1 podman[86905]: 2026-02-02 09:42:18.337779135 +0000 UTC m=+0.069156590 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Feb 02 09:42:18 compute-1 podman[86905]: 2026-02-02 09:42:18.418198361 +0000 UTC m=+0.149575806 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Feb 02 09:42:18 compute-1 ceph-mon[80115]: 8.7 scrub starts
Feb 02 09:42:18 compute-1 ceph-mon[80115]: 8.7 scrub ok
Feb 02 09:42:18 compute-1 ceph-mon[80115]: 10.11 scrub starts
Feb 02 09:42:18 compute-1 ceph-mon[80115]: 10.11 scrub ok
Feb 02 09:42:18 compute-1 ceph-mon[80115]: 12.5 deep-scrub starts
Feb 02 09:42:18 compute-1 ceph-mon[80115]: 12.5 deep-scrub ok
Feb 02 09:42:18 compute-1 ceph-mon[80115]: mgrmap e27: compute-0.djvyfo(active, since 1.0671s), standbys: compute-2.gzlyac, compute-1.teascl
Feb 02 09:42:18 compute-1 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Bus STARTING
Feb 02 09:42:18 compute-1 ceph-mon[80115]: 12.13 scrub starts
Feb 02 09:42:18 compute-1 ceph-mon[80115]: 12.13 scrub ok
Feb 02 09:42:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:18.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:18 compute-1 podman[87046]: 2026-02-02 09:42:18.999308324 +0000 UTC m=+0.065465390 container exec 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 09:42:19 compute-1 podman[87046]: 2026-02-02 09:42:19.010639243 +0000 UTC m=+0.076796249 container exec_died 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 09:42:19 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.18 scrub starts
Feb 02 09:42:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:42:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:19.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:42:19 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.18 scrub ok
Feb 02 09:42:19 compute-1 podman[87166]: 2026-02-02 09:42:19.37790524 +0000 UTC m=+0.059805471 container exec 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 09:42:19 compute-1 podman[87166]: 2026-02-02 09:42:19.386779738 +0000 UTC m=+0.068679869 container exec_died 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 09:42:19 compute-1 podman[87230]: 2026-02-02 09:42:19.582483468 +0000 UTC m=+0.062611200 container exec 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, com.redhat.component=keepalived-container, distribution-scope=public, io.openshift.expose-services=, name=keepalived, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.28.2, vendor=Red Hat, Inc., version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 02 09:42:19 compute-1 podman[87230]: 2026-02-02 09:42:19.596718008 +0000 UTC m=+0.076845650 container exec_died 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, release=1793, build-date=2023-02-22T09:23:20, architecture=x86_64, version=2.2.4, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-type=git, description=keepalived for Ceph)
Feb 02 09:42:19 compute-1 ceph-mon[80115]: 11.6 scrub starts
Feb 02 09:42:19 compute-1 ceph-mon[80115]: 11.6 scrub ok
Feb 02 09:42:19 compute-1 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Serving on https://192.168.122.100:7150
Feb 02 09:42:19 compute-1 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Client ('192.168.122.100', 35430) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 02 09:42:19 compute-1 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Serving on http://192.168.122.100:8765
Feb 02 09:42:19 compute-1 ceph-mon[80115]: [02/Feb/2026:09:42:18] ENGINE Bus STARTED
Feb 02 09:42:19 compute-1 ceph-mon[80115]: 10.6 scrub starts
Feb 02 09:42:19 compute-1 ceph-mon[80115]: 10.6 scrub ok
Feb 02 09:42:19 compute-1 ceph-mon[80115]: pgmap v4: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:19 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Feb 02 09:42:19 compute-1 ceph-mon[80115]: 12.17 scrub starts
Feb 02 09:42:19 compute-1 ceph-mon[80115]: 12.17 scrub ok
Feb 02 09:42:19 compute-1 sudo[86810]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.775767) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025339775840, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 905, "num_deletes": 251, "total_data_size": 3412189, "memory_usage": 3570384, "flush_reason": "Manual Compaction"}
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Feb 02 09:42:19 compute-1 sudo[87263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:42:19 compute-1 sudo[87263]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:19 compute-1 sudo[87263]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025339814911, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 2243757, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6855, "largest_seqno": 7755, "table_properties": {"data_size": 2239090, "index_size": 2123, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 12500, "raw_average_key_size": 21, "raw_value_size": 2228756, "raw_average_value_size": 3803, "num_data_blocks": 92, "num_entries": 586, "num_filter_entries": 586, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025321, "oldest_key_time": 1770025321, "file_creation_time": 1770025339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 39235 microseconds, and 4221 cpu microseconds.
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.815014) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 2243757 bytes OK
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.815035) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.843625) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.843699) EVENT_LOG_v1 {"time_micros": 1770025339843683, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.843734) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 3407067, prev total WAL file size 3435163, number of live WAL files 2.
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.844632) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(2191KB)], [15(10MB)]
Feb 02 09:42:19 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025339844702, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 13775194, "oldest_snapshot_seqno": -1}
Feb 02 09:42:19 compute-1 sudo[87288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:42:19 compute-1 sudo[87288]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:20 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1a scrub starts
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3197 keys, 12428737 bytes, temperature: kUnknown
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025340033883, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12428737, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12403370, "index_size": 16298, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8005, "raw_key_size": 82723, "raw_average_key_size": 25, "raw_value_size": 12340273, "raw_average_value_size": 3859, "num_data_blocks": 707, "num_entries": 3197, "num_filter_entries": 3197, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.034225) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12428737 bytes
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.036157) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 72.8 rd, 65.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 11.0 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(11.7) write-amplify(5.5) OK, records in: 3729, records dropped: 532 output_compression: NoCompression
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.036189) EVENT_LOG_v1 {"time_micros": 1770025340036174, "job": 6, "event": "compaction_finished", "compaction_time_micros": 189277, "compaction_time_cpu_micros": 22185, "output_level": 6, "num_output_files": 1, "total_output_size": 12428737, "num_input_records": 3729, "num_output_records": 3197, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025340036608, "job": 6, "event": "table_file_deletion", "file_number": 17}
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025340037865, "job": 6, "event": "table_file_deletion", "file_number": 15}
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:19.844532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037968) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037978) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:42:20 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:42:20.037984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:42:20 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1a scrub ok
Feb 02 09:42:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Feb 02 09:42:20 compute-1 sudo[87288]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:20 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 2.
Feb 02 09:42:20 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:42:20 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.105s CPU time.
Feb 02 09:42:20 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:42:20 compute-1 sudo[87344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:42:20 compute-1 sudo[87344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:20 compute-1 sudo[87344]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:20 compute-1 sudo[87372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Feb 02 09:42:20 compute-1 sudo[87372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:20 compute-1 ceph-mon[80115]: 11.18 scrub starts
Feb 02 09:42:20 compute-1 ceph-mon[80115]: 11.18 scrub ok
Feb 02 09:42:20 compute-1 ceph-mon[80115]: mgrmap e28: compute-0.djvyfo(active, since 2s), standbys: compute-2.gzlyac, compute-1.teascl
Feb 02 09:42:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:20 compute-1 ceph-mon[80115]: 12.0 scrub starts
Feb 02 09:42:20 compute-1 ceph-mon[80115]: 12.0 scrub ok
Feb 02 09:42:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Feb 02 09:42:20 compute-1 ceph-mon[80115]: osdmap e77: 3 total, 3 up, 3 in
Feb 02 09:42:20 compute-1 ceph-mon[80115]: 12.1d scrub starts
Feb 02 09:42:20 compute-1 ceph-mon[80115]: 12.1d scrub ok
Feb 02 09:42:20 compute-1 podman[87440]: 2026-02-02 09:42:20.79212459 +0000 UTC m=+0.042435084 container create fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:42:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:42:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:42:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:42:20 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:42:20 compute-1 podman[87440]: 2026-02-02 09:42:20.850487714 +0000 UTC m=+0.100798218 container init fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Feb 02 09:42:20 compute-1 podman[87440]: 2026-02-02 09:42:20.855534978 +0000 UTC m=+0.105845462 container start fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:42:20 compute-1 bash[87440]: fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9
Feb 02 09:42:20 compute-1 podman[87440]: 2026-02-02 09:42:20.775405089 +0000 UTC m=+0.025715623 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:42:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:42:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:42:20 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:42:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:42:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:42:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:42:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:42:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:42:20 compute-1 sudo[87372]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:42:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:20.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:21 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1e scrub starts
Feb 02 09:42:21 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1e scrub ok
Feb 02 09:42:21 compute-1 sshd-session[87514]: Accepted publickey for zuul from 192.168.122.30 port 56634 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:42:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:21.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:21 compute-1 systemd-logind[805]: New session 37 of user zuul.
Feb 02 09:42:21 compute-1 systemd[1]: Started Session 37 of User zuul.
Feb 02 09:42:21 compute-1 sshd-session[87514]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:42:21 compute-1 ceph-mon[80115]: 8.1a scrub starts
Feb 02 09:42:21 compute-1 ceph-mon[80115]: 8.1a scrub ok
Feb 02 09:42:21 compute-1 ceph-mon[80115]: 12.1f scrub starts
Feb 02 09:42:21 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:21 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:21 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb 02 09:42:21 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:21 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:21 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb 02 09:42:21 compute-1 ceph-mon[80115]: pgmap v6: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:21 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Feb 02 09:42:21 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:21 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:21 compute-1 ceph-mon[80115]: 8.1c scrub starts
Feb 02 09:42:21 compute-1 ceph-mon[80115]: 8.1c scrub ok
Feb 02 09:42:21 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.1f deep-scrub starts
Feb 02 09:42:22 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.1f deep-scrub ok
Feb 02 09:42:22 compute-1 python3.9[87667]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:42:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Feb 02 09:42:22 compute-1 sudo[87724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 02 09:42:22 compute-1 sudo[87724]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:22 compute-1 sudo[87724]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:22 compute-1 sudo[87749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph
Feb 02 09:42:22 compute-1 sudo[87749]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:22 compute-1 sudo[87749]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:22 compute-1 ceph-mon[80115]: 12.1f scrub ok
Feb 02 09:42:22 compute-1 ceph-mon[80115]: 8.1e scrub starts
Feb 02 09:42:22 compute-1 ceph-mon[80115]: 8.1e scrub ok
Feb 02 09:42:22 compute-1 ceph-mon[80115]: 10.1c scrub starts
Feb 02 09:42:22 compute-1 ceph-mon[80115]: 10.1c scrub ok
Feb 02 09:42:22 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Feb 02 09:42:22 compute-1 ceph-mon[80115]: osdmap e78: 3 total, 3 up, 3 in
Feb 02 09:42:22 compute-1 ceph-mon[80115]: mgrmap e29: compute-0.djvyfo(active, since 5s), standbys: compute-2.gzlyac, compute-1.teascl
Feb 02 09:42:22 compute-1 ceph-mon[80115]: 8.1f scrub starts
Feb 02 09:42:22 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:22 compute-1 ceph-mon[80115]: 8.1f scrub ok
Feb 02 09:42:22 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:22 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb 02 09:42:22 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:22 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:42:22 compute-1 sudo[87774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:42:22 compute-1 sudo[87774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:22 compute-1 sudo[87774]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:22 compute-1 sudo[87806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:42:22 compute-1 sudo[87806]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:22 compute-1 sudo[87806]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:22 compute-1 sudo[87831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:42:22 compute-1 sudo[87831]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:22 compute-1 sudo[87831]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:22 compute-1 sudo[87879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:42:22 compute-1 sudo[87879]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:22 compute-1 sudo[87879]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:22 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1d scrub starts
Feb 02 09:42:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:22.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:22 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.1d scrub ok
Feb 02 09:42:22 compute-1 sudo[87909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new
Feb 02 09:42:22 compute-1 sudo[87909]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:22 compute-1 sudo[87909]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[87953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 02 09:42:23 compute-1 sudo[87953]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[87953]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:23.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:23 compute-1 sudo[87978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:42:23 compute-1 sudo[87978]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[87978]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:42:23 compute-1 sudo[88003]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88003]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:42:23 compute-1 sudo[88028]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88028]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:42:23 compute-1 sudo[88053]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88053]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:42:23 compute-1 sudo[88078]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88078]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:42:23 compute-1 sudo[88126]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88126]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new
Feb 02 09:42:23 compute-1 sudo[88151]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88151]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 78 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=10.051406860s) [2] r=-1 lpr=78 pi=[57,78)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 204.756744385s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 78 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=10.051076889s) [2] r=-1 lpr=78 pi=[57,78)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 204.756744385s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 78 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=10.051866531s) [2] r=-1 lpr=78 pi=[57,78)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 204.757888794s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 78 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=78 pruub=10.051836014s) [2] r=-1 lpr=78 pi=[57,78)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 204.757888794s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:23 compute-1 sudo[88176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:42:23 compute-1 sudo[88176]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88176]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 02 09:42:23 compute-1 sudo[88201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88201]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph
Feb 02 09:42:23 compute-1 sudo[88226]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88226]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79 pruub=9.832950592s) [2] r=-1 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 204.756683350s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79 pruub=9.832920074s) [2] r=-1 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 204.756683350s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79 pruub=9.833053589s) [2] r=-1 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 204.757949829s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:23 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 79 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79 pruub=9.833026886s) [2] r=-1 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 204.757949829s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:23 compute-1 ceph-mon[80115]: 11.1f deep-scrub starts
Feb 02 09:42:23 compute-1 ceph-mon[80115]: 11.1f deep-scrub ok
Feb 02 09:42:23 compute-1 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.conf
Feb 02 09:42:23 compute-1 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.conf
Feb 02 09:42:23 compute-1 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.conf
Feb 02 09:42:23 compute-1 ceph-mon[80115]: 10.1a deep-scrub starts
Feb 02 09:42:23 compute-1 ceph-mon[80115]: 10.1a deep-scrub ok
Feb 02 09:42:23 compute-1 ceph-mon[80115]: pgmap v8: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:23 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Feb 02 09:42:23 compute-1 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:42:23 compute-1 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:42:23 compute-1 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.conf
Feb 02 09:42:23 compute-1 ceph-mon[80115]: 10.4 scrub starts
Feb 02 09:42:23 compute-1 ceph-mon[80115]: 10.4 scrub ok
Feb 02 09:42:23 compute-1 sudo[88251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:42:23 compute-1 sudo[88251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88251]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:42:23 compute-1 sudo[88276]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88276]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:42:23 compute-1 sudo[88324]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88324]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 sudo[88401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:42:23 compute-1 sudo[88401]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:23 compute-1 sudo[88401]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:23 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.10 scrub starts
Feb 02 09:42:23 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.10 scrub ok
Feb 02 09:42:24 compute-1 sudo[88426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new
Feb 02 09:42:24 compute-1 sudo[88426]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88426]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 sudo[88454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 02 09:42:24 compute-1 sudo[88454]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88454]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 sudo[88500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:42:24 compute-1 sudo[88500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88500]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 sudo[88597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sztttcmaqwtpobkcuxkcgcppxzzdisbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025343.7984724-52-261703540844414/AnsiballZ_command.py'
Feb 02 09:42:24 compute-1 sudo[88597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:42:24 compute-1 sudo[88549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config
Feb 02 09:42:24 compute-1 sudo[88549]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88549]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 sudo[88602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:42:24 compute-1 sudo[88602]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88602]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 sudo[88627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:42:24 compute-1 sudo[88627]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88627]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 sudo[88652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:42:24 compute-1 sudo[88652]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88652]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 python3.9[88600]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:42:24 compute-1 sudo[88706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:42:24 compute-1 sudo[88706]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88706]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 sudo[88731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new
Feb 02 09:42:24 compute-1 sudo[88731]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88731]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 sudo[88757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-d241d473-9fcb-5f74-b163-f1ca4454e7f1/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring.new /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:42:24 compute-1 sudo[88757]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:24 compute-1 sudo[88757]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:24 compute-1 ceph-mon[80115]: 8.1d scrub starts
Feb 02 09:42:24 compute-1 ceph-mon[80115]: 8.1d scrub ok
Feb 02 09:42:24 compute-1 ceph-mon[80115]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:42:24 compute-1 ceph-mon[80115]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:42:24 compute-1 ceph-mon[80115]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Feb 02 09:42:24 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Feb 02 09:42:24 compute-1 ceph-mon[80115]: osdmap e79: 3 total, 3 up, 3 in
Feb 02 09:42:24 compute-1 ceph-mon[80115]: 10.1d scrub starts
Feb 02 09:42:24 compute-1 ceph-mon[80115]: 10.1d scrub ok
Feb 02 09:42:24 compute-1 ceph-mon[80115]: Updating compute-2:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:42:24 compute-1 ceph-mon[80115]: Updating compute-1:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:42:24 compute-1 ceph-mon[80115]: Updating compute-0:/var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/config/ceph.client.admin.keyring
Feb 02 09:42:24 compute-1 ceph-mon[80115]: 12.4 scrub starts
Feb 02 09:42:24 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:24 compute-1 ceph-mon[80115]: 12.4 scrub ok
Feb 02 09:42:24 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:24 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:24 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:24 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Feb 02 09:42:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] async=[2] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 80 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=79) [2]/[0] async=[2] r=0 lpr=79 pi=[57,79)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:24.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:25 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.13 scrub starts
Feb 02 09:42:25 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 8.13 scrub ok
Feb 02 09:42:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:25.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:25 compute-1 ceph-mon[80115]: 11.10 scrub starts
Feb 02 09:42:25 compute-1 ceph-mon[80115]: 11.10 scrub ok
Feb 02 09:42:25 compute-1 ceph-mon[80115]: 12.1b scrub starts
Feb 02 09:42:25 compute-1 ceph-mon[80115]: 12.1b scrub ok
Feb 02 09:42:25 compute-1 ceph-mon[80115]: osdmap e80: 3 total, 3 up, 3 in
Feb 02 09:42:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:25 compute-1 ceph-mon[80115]: pgmap v11: 353 pgs: 353 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 0 B/s wr, 10 op/s
Feb 02 09:42:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Feb 02 09:42:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:42:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:42:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:25 compute-1 ceph-mon[80115]: 11.3 scrub starts
Feb 02 09:42:25 compute-1 ceph-mon[80115]: 11.3 scrub ok
Feb 02 09:42:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=6 ec=57/38 lis/c=79/57 les/c/f=80/58/0 sis=81 pruub=15.000723839s) [2] async=[2] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 211.974411011s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=81 pruub=15.783122063s) [1] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 212.757095337s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.8( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=6 ec=57/38 lis/c=79/57 les/c/f=80/58/0 sis=81 pruub=15.000452995s) [2] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 211.974411011s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=81 pruub=15.783082008s) [1] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 212.757095337s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=81 pruub=15.783574104s) [1] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 212.758117676s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=81 pruub=15.783516884s) [1] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 212.758117676s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=5 ec=57/38 lis/c=79/57 les/c/f=80/58/0 sis=81 pruub=14.999728203s) [2] async=[2] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 211.974380493s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.18( v 44'1041 (0'0,44'1041] local-lis/les=79/80 n=5 ec=57/38 lis/c=79/57 les/c/f=80/58/0 sis=81 pruub=14.999654770s) [2] r=-1 lpr=81 pi=[57,81)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 211.974380493s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] async=[2] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 81 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=80) [2]/[0] async=[2] r=0 lpr=80 pi=[57,80)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:26 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.11 scrub starts
Feb 02 09:42:26 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 11.11 scrub ok
Feb 02 09:42:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Feb 02 09:42:26 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=6 ec=57/38 lis/c=80/57 les/c/f=81/58/0 sis=82 pruub=15.465485573s) [2] async=[2] r=-1 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 212.981048584s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:26 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.9( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=6 ec=57/38 lis/c=80/57 les/c/f=81/58/0 sis=82 pruub=15.465374947s) [2] r=-1 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 212.981048584s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:26 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:26 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:26 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:26 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:26 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=5 ec=57/38 lis/c=80/57 les/c/f=81/58/0 sis=82 pruub=15.467186928s) [2] async=[2] r=-1 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 212.984359741s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:26 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 82 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=80/81 n=5 ec=57/38 lis/c=80/57 les/c/f=81/58/0 sis=82 pruub=15.467089653s) [2] r=-1 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 212.984359741s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:26 compute-1 ceph-mon[80115]: 8.13 scrub starts
Feb 02 09:42:26 compute-1 ceph-mon[80115]: 8.13 scrub ok
Feb 02 09:42:26 compute-1 ceph-mon[80115]: 12.16 scrub starts
Feb 02 09:42:26 compute-1 ceph-mon[80115]: 12.16 scrub ok
Feb 02 09:42:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Feb 02 09:42:26 compute-1 ceph-mon[80115]: osdmap e81: 3 total, 3 up, 3 in
Feb 02 09:42:26 compute-1 ceph-mon[80115]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb 02 09:42:26 compute-1 ceph-mon[80115]: osdmap e82: 3 total, 3 up, 3 in
Feb 02 09:42:26 compute-1 ceph-mon[80115]: 12.2 deep-scrub starts
Feb 02 09:42:26 compute-1 ceph-mon[80115]: 12.2 deep-scrub ok
Feb 02 09:42:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:42:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:26.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:42:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:42:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:42:27 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.2 scrub starts
Feb 02 09:42:27 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.2 scrub ok
Feb 02 09:42:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:27.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:28 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Feb 02 09:42:28 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Feb 02 09:42:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Feb 02 09:42:28 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 83 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=6 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] async=[1] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=9}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 11.11 scrub starts
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 11.11 scrub ok
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 12.14 scrub starts
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 12.14 scrub ok
Feb 02 09:42:28 compute-1 ceph-mon[80115]: pgmap v14: 353 pgs: 353 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 41 KiB/s rd, 0 B/s wr, 13 op/s
Feb 02 09:42:28 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 10.2 scrub starts
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 10.2 scrub ok
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 8.b scrub starts
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 8.b scrub ok
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 12.1 deep-scrub starts
Feb 02 09:42:28 compute-1 ceph-mon[80115]: 12.1 deep-scrub ok
Feb 02 09:42:28 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 83 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=82) [1]/[0] async=[1] r=0 lpr=82 pi=[57,82)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:28.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:28 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.b scrub starts
Feb 02 09:42:29 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.b scrub ok
Feb 02 09:42:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:29.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:29 compute-1 ceph-mon[80115]: 10.13 scrub starts
Feb 02 09:42:29 compute-1 ceph-mon[80115]: 10.13 scrub ok
Feb 02 09:42:29 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Feb 02 09:42:29 compute-1 ceph-mon[80115]: osdmap e83: 3 total, 3 up, 3 in
Feb 02 09:42:29 compute-1 ceph-mon[80115]: 8.5 scrub starts
Feb 02 09:42:29 compute-1 ceph-mon[80115]: 11.12 scrub starts
Feb 02 09:42:29 compute-1 ceph-mon[80115]: 11.12 scrub ok
Feb 02 09:42:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Feb 02 09:42:29 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 84 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=6 ec=57/38 lis/c=82/57 les/c/f=83/58/0 sis=84 pruub=14.974443436s) [1] async=[1] r=-1 lpr=84 pi=[57,84)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 215.460998535s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:29 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 84 pg[9.a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=6 ec=57/38 lis/c=82/57 les/c/f=83/58/0 sis=84 pruub=14.974328041s) [1] r=-1 lpr=84 pi=[57,84)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 215.460998535s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:29 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 84 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=5 ec=57/38 lis/c=82/57 les/c/f=83/58/0 sis=84 pruub=14.982478142s) [1] async=[1] r=-1 lpr=84 pi=[57,84)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 215.469451904s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:29 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 84 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=82/83 n=5 ec=57/38 lis/c=82/57 les/c/f=83/58/0 sis=84 pruub=14.982404709s) [1] r=-1 lpr=84 pi=[57,84)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 215.469451904s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:29 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.8 scrub starts
Feb 02 09:42:29 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.8 scrub ok
Feb 02 09:42:30 compute-1 ceph-mon[80115]: 8.5 scrub ok
Feb 02 09:42:30 compute-1 ceph-mon[80115]: pgmap v16: 353 pgs: 1 active+clean+scrubbing, 2 remapped+peering, 2 peering, 348 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 133 B/s, 5 objects/s recovering
Feb 02 09:42:30 compute-1 ceph-mon[80115]: 12.b scrub starts
Feb 02 09:42:30 compute-1 ceph-mon[80115]: 12.b scrub ok
Feb 02 09:42:30 compute-1 ceph-mon[80115]: osdmap e84: 3 total, 3 up, 3 in
Feb 02 09:42:30 compute-1 ceph-mon[80115]: 11.8 scrub starts
Feb 02 09:42:30 compute-1 ceph-mon[80115]: 11.8 scrub ok
Feb 02 09:42:30 compute-1 ceph-mon[80115]: 11.1 scrub starts
Feb 02 09:42:30 compute-1 ceph-mon[80115]: 11.1 scrub ok
Feb 02 09:42:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Feb 02 09:42:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:30 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.8 scrub starts
Feb 02 09:42:30 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.8 scrub ok
Feb 02 09:42:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:42:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:30.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:42:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000024s ======
Feb 02 09:42:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:31.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Feb 02 09:42:31 compute-1 sudo[88808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:42:31 compute-1 sudo[88808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:31 compute-1 sudo[88808]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:31 compute-1 sudo[88833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:42:31 compute-1 sudo[88833]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:31 compute-1 sudo[88833]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:31 compute-1 ceph-mon[80115]: 12.8 scrub starts
Feb 02 09:42:31 compute-1 ceph-mon[80115]: 12.8 scrub ok
Feb 02 09:42:31 compute-1 ceph-mon[80115]: osdmap e85: 3 total, 3 up, 3 in
Feb 02 09:42:31 compute-1 ceph-mon[80115]: 8.a scrub starts
Feb 02 09:42:31 compute-1 ceph-mon[80115]: 8.a scrub ok
Feb 02 09:42:31 compute-1 ceph-mon[80115]: 11.14 scrub starts
Feb 02 09:42:31 compute-1 ceph-mon[80115]: 11.14 scrub ok
Feb 02 09:42:31 compute-1 ceph-mon[80115]: pgmap v19: 353 pgs: 1 active+clean+scrubbing, 2 remapped+peering, 2 peering, 348 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 120 B/s, 5 objects/s recovering
Feb 02 09:42:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:31 compute-1 sudo[88597]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:31 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.e scrub starts
Feb 02 09:42:31 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.e scrub ok
Feb 02 09:42:32 compute-1 ceph-mon[80115]: 10.8 scrub starts
Feb 02 09:42:32 compute-1 ceph-mon[80115]: 10.8 scrub ok
Feb 02 09:42:32 compute-1 ceph-mon[80115]: Reconfiguring mon.compute-0 (monmap changed)...
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:32 compute-1 ceph-mon[80115]: Reconfiguring daemon mon.compute-0 on compute-0
Feb 02 09:42:32 compute-1 ceph-mon[80115]: 10.f scrub starts
Feb 02 09:42:32 compute-1 ceph-mon[80115]: 10.f scrub ok
Feb 02 09:42:32 compute-1 ceph-mon[80115]: 11.f scrub starts
Feb 02 09:42:32 compute-1 ceph-mon[80115]: 11.f scrub ok
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:32 compute-1 ceph-mon[80115]: Reconfiguring mgr.compute-0.djvyfo (monmap changed)...
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.djvyfo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:32 compute-1 ceph-mon[80115]: Reconfiguring daemon mgr.compute-0.djvyfo on compute-0
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:42:32 compute-1 sshd-session[87517]: Connection closed by 192.168.122.30 port 56634
Feb 02 09:42:32 compute-1 sshd-session[87514]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:42:32 compute-1 systemd[1]: session-37.scope: Deactivated successfully.
Feb 02 09:42:32 compute-1 systemd[1]: session-37.scope: Consumed 7.908s CPU time.
Feb 02 09:42:32 compute-1 systemd-logind[805]: Session 37 logged out. Waiting for processes to exit.
Feb 02 09:42:32 compute-1 systemd-logind[805]: Removed session 37.
Feb 02 09:42:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:32.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:32 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.c scrub starts
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:42:32 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.c scrub ok
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:42:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:42:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:33.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:33 compute-1 ceph-mon[80115]: 12.e scrub starts
Feb 02 09:42:33 compute-1 ceph-mon[80115]: 12.e scrub ok
Feb 02 09:42:33 compute-1 ceph-mon[80115]: 11.e scrub starts
Feb 02 09:42:33 compute-1 ceph-mon[80115]: 11.e scrub ok
Feb 02 09:42:33 compute-1 ceph-mon[80115]: 11.4 deep-scrub starts
Feb 02 09:42:33 compute-1 ceph-mon[80115]: 11.4 deep-scrub ok
Feb 02 09:42:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:33 compute-1 ceph-mon[80115]: Reconfiguring crash.compute-0 (monmap changed)...
Feb 02 09:42:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Feb 02 09:42:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:33 compute-1 ceph-mon[80115]: Reconfiguring daemon crash.compute-0 on compute-0
Feb 02 09:42:33 compute-1 ceph-mon[80115]: pgmap v20: 353 pgs: 1 active+clean+scrubbing, 2 remapped+peering, 2 peering, 348 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 91 B/s, 3 objects/s recovering
Feb 02 09:42:33 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.12 scrub starts
Feb 02 09:42:33 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.12 scrub ok
Feb 02 09:42:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:34 compute-1 ceph-mon[80115]: 12.c scrub starts
Feb 02 09:42:34 compute-1 ceph-mon[80115]: 12.c scrub ok
Feb 02 09:42:34 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:34 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:34 compute-1 ceph-mon[80115]: Reconfiguring osd.1 (monmap changed)...
Feb 02 09:42:34 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Feb 02 09:42:34 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:34 compute-1 ceph-mon[80115]: Reconfiguring daemon osd.1 on compute-0
Feb 02 09:42:34 compute-1 ceph-mon[80115]: 11.a scrub starts
Feb 02 09:42:34 compute-1 ceph-mon[80115]: 11.a scrub ok
Feb 02 09:42:34 compute-1 ceph-mon[80115]: 8.1b deep-scrub starts
Feb 02 09:42:34 compute-1 ceph-mon[80115]: 8.1b deep-scrub ok
Feb 02 09:42:34 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:34 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:34 compute-1 ceph-mon[80115]: Reconfiguring node-exporter.compute-0 (unknown last config time)...
Feb 02 09:42:34 compute-1 ceph-mon[80115]: Reconfiguring daemon node-exporter.compute-0 on compute-0
Feb 02 09:42:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:34.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:34 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.18 scrub starts
Feb 02 09:42:34 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.18 scrub ok
Feb 02 09:42:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094235 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:42:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:35 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:35.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Feb 02 09:42:35 compute-1 ceph-mon[80115]: 12.12 scrub starts
Feb 02 09:42:35 compute-1 ceph-mon[80115]: 12.12 scrub ok
Feb 02 09:42:35 compute-1 ceph-mon[80115]: 11.7 deep-scrub starts
Feb 02 09:42:35 compute-1 ceph-mon[80115]: 11.7 deep-scrub ok
Feb 02 09:42:35 compute-1 ceph-mon[80115]: pgmap v21: 353 pgs: 353 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 32 B/s, 1 objects/s recovering
Feb 02 09:42:35 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Feb 02 09:42:35 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:35 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:35 compute-1 ceph-mon[80115]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Feb 02 09:42:35 compute-1 ceph-mon[80115]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Feb 02 09:42:36 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.5 scrub starts
Feb 02 09:42:36 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.5 scrub ok
Feb 02 09:42:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:36 compute-1 ceph-mon[80115]: 12.9 deep-scrub starts
Feb 02 09:42:36 compute-1 ceph-mon[80115]: 12.9 deep-scrub ok
Feb 02 09:42:36 compute-1 ceph-mon[80115]: 10.18 scrub starts
Feb 02 09:42:36 compute-1 ceph-mon[80115]: 10.18 scrub ok
Feb 02 09:42:36 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Feb 02 09:42:36 compute-1 ceph-mon[80115]: osdmap e86: 3 total, 3 up, 3 in
Feb 02 09:42:36 compute-1 ceph-mon[80115]: 8.c scrub starts
Feb 02 09:42:36 compute-1 ceph-mon[80115]: 8.c scrub ok
Feb 02 09:42:36 compute-1 ceph-mon[80115]: 11.1b deep-scrub starts
Feb 02 09:42:36 compute-1 ceph-mon[80115]: 11.1b deep-scrub ok
Feb 02 09:42:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4002070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:36.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:37 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.19 scrub starts
Feb 02 09:42:37 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.19 scrub ok
Feb 02 09:42:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:37 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:37.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Feb 02 09:42:37 compute-1 ceph-mon[80115]: 10.5 scrub starts
Feb 02 09:42:37 compute-1 ceph-mon[80115]: 10.5 scrub ok
Feb 02 09:42:37 compute-1 ceph-mon[80115]: 8.d deep-scrub starts
Feb 02 09:42:37 compute-1 ceph-mon[80115]: 8.d deep-scrub ok
Feb 02 09:42:37 compute-1 ceph-mon[80115]: 8.4 scrub starts
Feb 02 09:42:37 compute-1 ceph-mon[80115]: 8.4 scrub ok
Feb 02 09:42:37 compute-1 ceph-mon[80115]: pgmap v23: 353 pgs: 353 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 28 B/s, 1 objects/s recovering
Feb 02 09:42:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Feb 02 09:42:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:37 compute-1 ceph-mon[80115]: Reconfiguring grafana.compute-0 (dependencies changed)...
Feb 02 09:42:37 compute-1 ceph-mon[80115]: Reconfiguring daemon grafana.compute-0 on compute-0
Feb 02 09:42:38 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.19 scrub starts
Feb 02 09:42:38 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 10.19 scrub ok
Feb 02 09:42:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:38 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Feb 02 09:42:38 compute-1 ceph-mon[80115]: 12.19 scrub starts
Feb 02 09:42:38 compute-1 ceph-mon[80115]: 12.19 scrub ok
Feb 02 09:42:38 compute-1 ceph-mon[80115]: 8.11 scrub starts
Feb 02 09:42:38 compute-1 ceph-mon[80115]: 8.11 scrub ok
Feb 02 09:42:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Feb 02 09:42:38 compute-1 ceph-mon[80115]: osdmap e87: 3 total, 3 up, 3 in
Feb 02 09:42:38 compute-1 ceph-mon[80115]: 11.1d scrub starts
Feb 02 09:42:38 compute-1 ceph-mon[80115]: 11.1d scrub ok
Feb 02 09:42:38 compute-1 ceph-mon[80115]: 11.17 scrub starts
Feb 02 09:42:38 compute-1 ceph-mon[80115]: 11.17 scrub ok
Feb 02 09:42:38 compute-1 ceph-mon[80115]: osdmap e88: 3 total, 3 up, 3 in
Feb 02 09:42:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:38.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:39 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4002070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:39.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:39 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.1c scrub starts
Feb 02 09:42:39 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 12.1c scrub ok
Feb 02 09:42:39 compute-1 sudo[88904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:42:39 compute-1 sudo[88904]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:39 compute-1 sudo[88904]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:39 compute-1 sudo[88929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:42:39 compute-1 sudo[88929]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:39 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Feb 02 09:42:39 compute-1 podman[88972]: 2026-02-02 09:42:39.596974171 +0000 UTC m=+0.052074361 container create c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Feb 02 09:42:39 compute-1 ceph-mon[80115]: 10.19 scrub starts
Feb 02 09:42:39 compute-1 ceph-mon[80115]: 10.19 scrub ok
Feb 02 09:42:39 compute-1 ceph-mon[80115]: 11.1e scrub starts
Feb 02 09:42:39 compute-1 ceph-mon[80115]: pgmap v26: 353 pgs: 2 unknown, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 36 B/s, 2 objects/s recovering
Feb 02 09:42:39 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:39 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:39 compute-1 ceph-mon[80115]: Reconfiguring crash.compute-1 (monmap changed)...
Feb 02 09:42:39 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Feb 02 09:42:39 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:39 compute-1 ceph-mon[80115]: Reconfiguring daemon crash.compute-1 on compute-1
Feb 02 09:42:39 compute-1 ceph-mon[80115]: 10.1 scrub starts
Feb 02 09:42:39 compute-1 ceph-mon[80115]: 10.1 scrub ok
Feb 02 09:42:39 compute-1 ceph-mon[80115]: osdmap e89: 3 total, 3 up, 3 in
Feb 02 09:42:39 compute-1 systemd[1]: Started libpod-conmon-c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067.scope.
Feb 02 09:42:39 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:42:39 compute-1 podman[88972]: 2026-02-02 09:42:39.570855149 +0000 UTC m=+0.025955329 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:42:39 compute-1 podman[88972]: 2026-02-02 09:42:39.675839679 +0000 UTC m=+0.130939869 container init c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True)
Feb 02 09:42:39 compute-1 podman[88972]: 2026-02-02 09:42:39.683824896 +0000 UTC m=+0.138925076 container start c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:42:39 compute-1 podman[88972]: 2026-02-02 09:42:39.687582418 +0000 UTC m=+0.142682658 container attach c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:42:39 compute-1 stupefied_edison[88988]: 167 167
Feb 02 09:42:39 compute-1 systemd[1]: libpod-c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067.scope: Deactivated successfully.
Feb 02 09:42:39 compute-1 podman[88972]: 2026-02-02 09:42:39.690888939 +0000 UTC m=+0.145989129 container died c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:42:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-d9f3e90accea7336dd75055b02d47aad36752e28d75b34bcc4df3068854b64c9-merged.mount: Deactivated successfully.
Feb 02 09:42:39 compute-1 podman[88972]: 2026-02-02 09:42:39.735734742 +0000 UTC m=+0.190834922 container remove c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_edison, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:42:39 compute-1 systemd[1]: libpod-conmon-c3586480d58c31685c7db2d649d58f9caa869b8594014115259c615aeed48067.scope: Deactivated successfully.
Feb 02 09:42:39 compute-1 sudo[88929]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:39 compute-1 sudo[89004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:42:39 compute-1 sudo[89004]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:39 compute-1 sudo[89004]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:39 compute-1 sudo[89029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:42:39 compute-1 sudo[89029]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:40 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.14 scrub starts
Feb 02 09:42:40 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.14 scrub ok
Feb 02 09:42:40 compute-1 podman[89069]: 2026-02-02 09:42:40.255776273 +0000 UTC m=+0.041044550 container create d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:42:40 compute-1 systemd[1]: Started libpod-conmon-d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d.scope.
Feb 02 09:42:40 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:42:40 compute-1 podman[89069]: 2026-02-02 09:42:40.238415206 +0000 UTC m=+0.023683463 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:42:40 compute-1 podman[89069]: 2026-02-02 09:42:40.343447297 +0000 UTC m=+0.128715614 container init d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:42:40 compute-1 podman[89069]: 2026-02-02 09:42:40.349774573 +0000 UTC m=+0.135042850 container start d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:42:40 compute-1 distracted_elbakyan[89086]: 167 167
Feb 02 09:42:40 compute-1 systemd[1]: libpod-d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d.scope: Deactivated successfully.
Feb 02 09:42:40 compute-1 podman[89069]: 2026-02-02 09:42:40.354184651 +0000 UTC m=+0.139453048 container attach d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Feb 02 09:42:40 compute-1 conmon[89086]: conmon d2b237db8ba932eeec5f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d.scope/container/memory.events
Feb 02 09:42:40 compute-1 podman[89069]: 2026-02-02 09:42:40.355488853 +0000 UTC m=+0.140757140 container died d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:42:40 compute-1 systemd[1]: var-lib-containers-storage-overlay-38f9f7c4fd5ad40dee5edcb9f87756e146a0d27526016fe6503dd6c79997ea4b-merged.mount: Deactivated successfully.
Feb 02 09:42:40 compute-1 podman[89069]: 2026-02-02 09:42:40.448868419 +0000 UTC m=+0.234136706 container remove d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=distracted_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:42:40 compute-1 systemd[1]: libpod-conmon-d2b237db8ba932eeec5f117837f688215568c8e8aee9fead59f2f2ca21e8f11d.scope: Deactivated successfully.
Feb 02 09:42:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:40 compute-1 sudo[89029]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:40 compute-1 ceph-mon[80115]: 11.1e scrub ok
Feb 02 09:42:40 compute-1 ceph-mon[80115]: 12.1c scrub starts
Feb 02 09:42:40 compute-1 ceph-mon[80115]: 12.1c scrub ok
Feb 02 09:42:40 compute-1 ceph-mon[80115]: 8.18 scrub starts
Feb 02 09:42:40 compute-1 ceph-mon[80115]: 8.18 scrub ok
Feb 02 09:42:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:40 compute-1 ceph-mon[80115]: Reconfiguring osd.0 (monmap changed)...
Feb 02 09:42:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Feb 02 09:42:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:40 compute-1 ceph-mon[80115]: Reconfiguring daemon osd.0 on compute-1
Feb 02 09:42:40 compute-1 ceph-mon[80115]: 11.13 scrub starts
Feb 02 09:42:40 compute-1 ceph-mon[80115]: 11.13 scrub ok
Feb 02 09:42:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Feb 02 09:42:40 compute-1 sudo[89112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:42:40 compute-1 sudo[89112]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:40 compute-1 sudo[89112]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:40 compute-1 sudo[89137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1
Feb 02 09:42:40 compute-1 sudo[89137]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:40.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:41 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:42:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:41.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:42:41 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Feb 02 09:42:41 compute-1 podman[89178]: 2026-02-02 09:42:41.117435731 +0000 UTC m=+0.060619861 container create daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default)
Feb 02 09:42:41 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Feb 02 09:42:41 compute-1 systemd[1]: Started libpod-conmon-daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e.scope.
Feb 02 09:42:41 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:42:41 compute-1 podman[89178]: 2026-02-02 09:42:41.090502019 +0000 UTC m=+0.033686229 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:42:41 compute-1 podman[89178]: 2026-02-02 09:42:41.197763426 +0000 UTC m=+0.140947586 container init daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:42:41 compute-1 podman[89178]: 2026-02-02 09:42:41.204422429 +0000 UTC m=+0.147606559 container start daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Feb 02 09:42:41 compute-1 silly_lewin[89194]: 167 167
Feb 02 09:42:41 compute-1 podman[89178]: 2026-02-02 09:42:41.208179122 +0000 UTC m=+0.151363282 container attach daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Feb 02 09:42:41 compute-1 systemd[1]: libpod-daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e.scope: Deactivated successfully.
Feb 02 09:42:41 compute-1 podman[89178]: 2026-02-02 09:42:41.209100894 +0000 UTC m=+0.152285034 container died daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Feb 02 09:42:41 compute-1 systemd[1]: var-lib-containers-storage-overlay-2b7a7e70d2c6d0e533dfe8ca66f3460c2dd5eeab061c3c6b27a85d610fadc521-merged.mount: Deactivated successfully.
Feb 02 09:42:41 compute-1 podman[89178]: 2026-02-02 09:42:41.244467004 +0000 UTC m=+0.187651164 container remove daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_lewin, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:42:41 compute-1 systemd[1]: libpod-conmon-daf4d34576db2cd3339688760495856b0b6d3cb98718f75eec158d6dcef57c3e.scope: Deactivated successfully.
Feb 02 09:42:41 compute-1 sudo[89137]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Feb 02 09:42:41 compute-1 ceph-mon[80115]: 9.14 scrub starts
Feb 02 09:42:41 compute-1 ceph-mon[80115]: 9.14 scrub ok
Feb 02 09:42:41 compute-1 ceph-mon[80115]: osdmap e90: 3 total, 3 up, 3 in
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:41 compute-1 ceph-mon[80115]: Reconfiguring mon.compute-1 (monmap changed)...
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:41 compute-1 ceph-mon[80115]: Reconfiguring daemon mon.compute-1 on compute-1
Feb 02 09:42:41 compute-1 ceph-mon[80115]: 11.1c scrub starts
Feb 02 09:42:41 compute-1 ceph-mon[80115]: 11.1c scrub ok
Feb 02 09:42:41 compute-1 ceph-mon[80115]: pgmap v29: 353 pgs: 2 unknown, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Feb 02 09:42:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:41 compute-1 ceph-mon[80115]: 12.1a scrub starts
Feb 02 09:42:41 compute-1 ceph-mon[80115]: 12.1a scrub ok
Feb 02 09:42:42 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Feb 02 09:42:42 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Feb 02 09:42:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4002070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:42 compute-1 ceph-mon[80115]: 9.11 scrub starts
Feb 02 09:42:42 compute-1 ceph-mon[80115]: 9.11 scrub ok
Feb 02 09:42:42 compute-1 ceph-mon[80115]: Reconfiguring mon.compute-2 (monmap changed)...
Feb 02 09:42:42 compute-1 ceph-mon[80115]: Reconfiguring daemon mon.compute-2 on compute-2
Feb 02 09:42:42 compute-1 ceph-mon[80115]: osdmap e91: 3 total, 3 up, 3 in
Feb 02 09:42:42 compute-1 ceph-mon[80115]: 8.12 scrub starts
Feb 02 09:42:42 compute-1 ceph-mon[80115]: 8.12 scrub ok
Feb 02 09:42:42 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:42 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:42 compute-1 ceph-mon[80115]: Reconfiguring mgr.compute-2.gzlyac (monmap changed)...
Feb 02 09:42:42 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gzlyac", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Feb 02 09:42:42 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 09:42:42 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:42 compute-1 ceph-mon[80115]: Reconfiguring daemon mgr.compute-2.gzlyac on compute-2
Feb 02 09:42:42 compute-1 ceph-mon[80115]: 12.18 scrub starts
Feb 02 09:42:42 compute-1 ceph-mon[80115]: 12.18 scrub ok
Feb 02 09:42:42 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:42 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:42.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:43 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:43.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:43 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.2 scrub starts
Feb 02 09:42:43 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.2 scrub ok
Feb 02 09:42:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094243 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:42:43 compute-1 ceph-mon[80115]: 9.10 scrub starts
Feb 02 09:42:43 compute-1 ceph-mon[80115]: 9.10 scrub ok
Feb 02 09:42:43 compute-1 ceph-mon[80115]: Reconfiguring haproxy.rgw.default.compute-2.txhwfs (unknown last config time)...
Feb 02 09:42:43 compute-1 ceph-mon[80115]: Reconfiguring daemon haproxy.rgw.default.compute-2.txhwfs on compute-2
Feb 02 09:42:43 compute-1 ceph-mon[80115]: 11.1a scrub starts
Feb 02 09:42:43 compute-1 ceph-mon[80115]: pgmap v31: 353 pgs: 2 unknown, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:43 compute-1 ceph-mon[80115]: 8.6 scrub starts
Feb 02 09:42:43 compute-1 ceph-mon[80115]: 8.6 scrub ok
Feb 02 09:42:44 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.c scrub starts
Feb 02 09:42:44 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.c scrub ok
Feb 02 09:42:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:44.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4002070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:44 compute-1 ceph-mon[80115]: 11.1a scrub ok
Feb 02 09:42:44 compute-1 ceph-mon[80115]: 9.2 scrub starts
Feb 02 09:42:44 compute-1 ceph-mon[80115]: 9.2 scrub ok
Feb 02 09:42:44 compute-1 ceph-mon[80115]: 11.5 scrub starts
Feb 02 09:42:44 compute-1 ceph-mon[80115]: 11.5 scrub ok
Feb 02 09:42:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Feb 02 09:42:44 compute-1 ceph-mon[80115]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Feb 02 09:42:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Feb 02 09:42:44 compute-1 ceph-mon[80115]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Feb 02 09:42:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Feb 02 09:42:44 compute-1 ceph-mon[80115]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Feb 02 09:42:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:44 compute-1 ceph-mon[80115]: 10.3 scrub starts
Feb 02 09:42:44 compute-1 ceph-mon[80115]: 10.3 scrub ok
Feb 02 09:42:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:42:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:45.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:42:45 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.0 deep-scrub starts
Feb 02 09:42:45 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.0 deep-scrub ok
Feb 02 09:42:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Feb 02 09:42:45 compute-1 ceph-mon[80115]: 9.c scrub starts
Feb 02 09:42:45 compute-1 ceph-mon[80115]: 9.c scrub ok
Feb 02 09:42:45 compute-1 ceph-mon[80115]: 9.16 scrub starts
Feb 02 09:42:45 compute-1 ceph-mon[80115]: 9.16 scrub ok
Feb 02 09:42:45 compute-1 ceph-mon[80115]: pgmap v32: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 52 B/s, 1 objects/s recovering
Feb 02 09:42:45 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Feb 02 09:42:45 compute-1 ceph-mon[80115]: 9.9 scrub starts
Feb 02 09:42:45 compute-1 ceph-mon[80115]: 9.9 scrub ok
Feb 02 09:42:46 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.1 scrub starts
Feb 02 09:42:46 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.1 scrub ok
Feb 02 09:42:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:42:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:46.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:42:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:46 compute-1 ceph-mon[80115]: 9.0 deep-scrub starts
Feb 02 09:42:46 compute-1 ceph-mon[80115]: 9.0 deep-scrub ok
Feb 02 09:42:46 compute-1 ceph-mon[80115]: 9.e scrub starts
Feb 02 09:42:46 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Feb 02 09:42:46 compute-1 ceph-mon[80115]: osdmap e92: 3 total, 3 up, 3 in
Feb 02 09:42:46 compute-1 ceph-mon[80115]: 9.e scrub ok
Feb 02 09:42:46 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:46 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:46 compute-1 ceph-mon[80115]: 9.3 scrub starts
Feb 02 09:42:46 compute-1 ceph-mon[80115]: 9.3 scrub ok
Feb 02 09:42:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:47 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:47.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:47 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.4 scrub starts
Feb 02 09:42:47 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.4 scrub ok
Feb 02 09:42:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Feb 02 09:42:47 compute-1 ceph-mon[80115]: 9.1 scrub starts
Feb 02 09:42:47 compute-1 ceph-mon[80115]: 9.1 scrub ok
Feb 02 09:42:47 compute-1 ceph-mon[80115]: 9.a scrub starts
Feb 02 09:42:47 compute-1 ceph-mon[80115]: 9.a scrub ok
Feb 02 09:42:47 compute-1 ceph-mon[80115]: pgmap v34: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 50 B/s, 1 objects/s recovering
Feb 02 09:42:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Feb 02 09:42:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:42:47 compute-1 ceph-mon[80115]: 9.17 scrub starts
Feb 02 09:42:47 compute-1 ceph-mon[80115]: 9.17 scrub ok
Feb 02 09:42:48 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.1c scrub starts
Feb 02 09:42:48 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.1c scrub ok
Feb 02 09:42:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:48.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:48 compute-1 ceph-mon[80115]: 9.4 scrub starts
Feb 02 09:42:48 compute-1 ceph-mon[80115]: 9.4 scrub ok
Feb 02 09:42:48 compute-1 ceph-mon[80115]: 9.6 scrub starts
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Feb 02 09:42:48 compute-1 ceph-mon[80115]: osdmap e93: 3 total, 3 up, 3 in
Feb 02 09:42:48 compute-1 ceph-mon[80115]: 9.6 scrub ok
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:42:48 compute-1 ceph-mon[80115]: pgmap v36: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 50 B/s, 1 objects/s recovering
Feb 02 09:42:48 compute-1 ceph-mon[80115]: pgmap v37: 353 pgs: 353 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 63 B/s, 2 objects/s recovering
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:42:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:42:48 compute-1 ceph-mon[80115]: 9.8 scrub starts
Feb 02 09:42:48 compute-1 ceph-mon[80115]: 9.8 scrub ok
Feb 02 09:42:48 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Feb 02 09:42:48 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 94 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=94 pruub=8.670993805s) [1] r=-1 lpr=94 pi=[57,94)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 228.754318237s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:48 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 94 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=94 pruub=8.670732498s) [1] r=-1 lpr=94 pi=[57,94)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 228.754318237s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:48 compute-1 sshd-session[89216]: Accepted publickey for zuul from 192.168.122.30 port 39176 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:42:48 compute-1 systemd-logind[805]: New session 38 of user zuul.
Feb 02 09:42:48 compute-1 systemd[1]: Started Session 38 of User zuul.
Feb 02 09:42:48 compute-1 sshd-session[89216]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:42:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:49 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:42:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:49.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:42:49 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.12 scrub starts
Feb 02 09:42:49 compute-1 ceph-osd[77691]: log_channel(cluster) log [DBG] : 9.12 scrub ok
Feb 02 09:42:49 compute-1 python3.9[89369]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 02 09:42:49 compute-1 ceph-mon[80115]: 9.1c scrub starts
Feb 02 09:42:49 compute-1 ceph-mon[80115]: 9.1c scrub ok
Feb 02 09:42:49 compute-1 ceph-mon[80115]: 9.1a scrub starts
Feb 02 09:42:49 compute-1 ceph-mon[80115]: Health check update: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb 02 09:42:49 compute-1 ceph-mon[80115]: 9.1a scrub ok
Feb 02 09:42:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Feb 02 09:42:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Feb 02 09:42:49 compute-1 ceph-mon[80115]: osdmap e94: 3 total, 3 up, 3 in
Feb 02 09:42:49 compute-1 ceph-mon[80115]: 9.b scrub starts
Feb 02 09:42:49 compute-1 ceph-mon[80115]: 9.b scrub ok
Feb 02 09:42:49 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Feb 02 09:42:49 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 95 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=95) [1]/[0] r=0 lpr=95 pi=[57,95)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:49 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 95 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=95) [1]/[0] r=0 lpr=95 pi=[57,95)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:50.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e4009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:50 compute-1 ceph-mon[80115]: 9.12 scrub starts
Feb 02 09:42:50 compute-1 ceph-mon[80115]: 9.12 scrub ok
Feb 02 09:42:50 compute-1 ceph-mon[80115]: 9.1e scrub starts
Feb 02 09:42:50 compute-1 ceph-mon[80115]: pgmap v39: 353 pgs: 2 unknown, 351 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:50 compute-1 ceph-mon[80115]: 9.1e scrub ok
Feb 02 09:42:50 compute-1 ceph-mon[80115]: osdmap e95: 3 total, 3 up, 3 in
Feb 02 09:42:50 compute-1 ceph-mon[80115]: 9.7 scrub starts
Feb 02 09:42:50 compute-1 ceph-mon[80115]: 9.7 scrub ok
Feb 02 09:42:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Feb 02 09:42:50 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 96 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=95/96 n=2 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=95) [1]/[0] async=[1] r=0 lpr=95 pi=[57,95)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:50 compute-1 python3.9[89544]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:42:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:51 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:51.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:51 compute-1 sudo[89557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:42:51 compute-1 sudo[89557]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Feb 02 09:42:51 compute-1 sudo[89557]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 97 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=95/96 n=2 ec=57/38 lis/c=95/57 les/c/f=96/58/0 sis=97 pruub=15.609633446s) [1] async=[1] r=-1 lpr=97 pi=[57,97)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 238.121505737s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:51 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 97 pg[9.10( v 44'1041 (0'0,44'1041] local-lis/les=95/96 n=2 ec=57/38 lis/c=95/57 les/c/f=96/58/0 sis=97 pruub=15.609547615s) [1] r=-1 lpr=97 pi=[57,97)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 238.121505737s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:51 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:42:51 compute-1 ceph-mon[80115]: osdmap e96: 3 total, 3 up, 3 in
Feb 02 09:42:51 compute-1 ceph-mon[80115]: osdmap e97: 3 total, 3 up, 3 in
Feb 02 09:42:51 compute-1 ceph-mon[80115]: 9.13 deep-scrub starts
Feb 02 09:42:51 compute-1 ceph-mon[80115]: 9.13 deep-scrub ok
Feb 02 09:42:52 compute-1 sudo[89723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqsfrqigmqfnqynsunrecjxclisiqcji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025371.5984075-89-228251832699641/AnsiballZ_command.py'
Feb 02 09:42:52 compute-1 sudo[89723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:42:52 compute-1 python3.9[89725]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:42:52 compute-1 sudo[89723]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:52 compute-1 systemd[83549]: Starting Mark boot as successful...
Feb 02 09:42:52 compute-1 systemd[83549]: Finished Mark boot as successful.
Feb 02 09:42:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Feb 02 09:42:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:42:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:52.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:42:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:52 compute-1 ceph-mon[80115]: pgmap v43: 353 pgs: 2 unknown, 351 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:52 compute-1 ceph-mon[80115]: 9.f scrub starts
Feb 02 09:42:52 compute-1 ceph-mon[80115]: 9.f scrub ok
Feb 02 09:42:52 compute-1 ceph-mon[80115]: osdmap e98: 3 total, 3 up, 3 in
Feb 02 09:42:52 compute-1 ceph-mon[80115]: 9.19 scrub starts
Feb 02 09:42:52 compute-1 ceph-mon[80115]: 9.19 scrub ok
Feb 02 09:42:52 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:53 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:53.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:53 compute-1 sudo[89880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsqxkztleoozazsfwfsdbuwjouvjrtss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025372.7198396-125-96589710547855/AnsiballZ_stat.py'
Feb 02 09:42:53 compute-1 sudo[89880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:42:53 compute-1 sudo[89878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:42:53 compute-1 sudo[89878]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:42:53 compute-1 sudo[89878]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:53 compute-1 python3.9[89898]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:42:53 compute-1 sudo[89880]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:54 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Feb 02 09:42:54 compute-1 sudo[90057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taebngoizbmayfjivdeqftxerxqcjdnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025373.7147033-158-48336819194838/AnsiballZ_file.py'
Feb 02 09:42:54 compute-1 sudo[90057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:42:54 compute-1 ceph-mon[80115]: 9.1f deep-scrub starts
Feb 02 09:42:54 compute-1 ceph-mon[80115]: 9.1f deep-scrub ok
Feb 02 09:42:54 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:42:54 compute-1 ceph-mon[80115]: 9.18 scrub starts
Feb 02 09:42:54 compute-1 ceph-mon[80115]: 9.18 scrub ok
Feb 02 09:42:54 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Feb 02 09:42:54 compute-1 python3.9[90059]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:42:54 compute-1 sudo[90057]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:42:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:54.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:42:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:54 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 99 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=99 pruub=10.741518021s) [1] r=-1 lpr=99 pi=[57,99)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 236.754623413s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:54 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 99 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=99 pruub=10.741458893s) [1] r=-1 lpr=99 pi=[57,99)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 236.754623413s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:54 compute-1 sudo[90210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dctebospqbilgnnfkuqheykzjugoyink ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025374.6536286-185-267412467231092/AnsiballZ_file.py'
Feb 02 09:42:54 compute-1 sudo[90210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:42:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:42:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:42:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:55 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:42:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:55.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:42:55 compute-1 python3.9[90212]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:42:55 compute-1 sudo[90210]: pam_unix(sudo:session): session closed for user root
Feb 02 09:42:55 compute-1 ceph-mon[80115]: pgmap v45: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Feb 02 09:42:55 compute-1 ceph-mon[80115]: osdmap e99: 3 total, 3 up, 3 in
Feb 02 09:42:55 compute-1 ceph-mon[80115]: 9.1b scrub starts
Feb 02 09:42:55 compute-1 ceph-mon[80115]: 9.1b scrub ok
Feb 02 09:42:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Feb 02 09:42:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 100 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=100) [1]/[0] r=0 lpr=100 pi=[57,100)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:55 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 100 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=100) [1]/[0] r=0 lpr=100 pi=[57,100)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:42:55 compute-1 python3.9[90362]: ansible-ansible.builtin.service_facts Invoked
Feb 02 09:42:55 compute-1 network[90379]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 02 09:42:55 compute-1 network[90380]: 'network-scripts' will be removed from distribution in near future.
Feb 02 09:42:55 compute-1 network[90381]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 02 09:42:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Feb 02 09:42:56 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 101 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=101 pruub=9.382583618s) [1] r=-1 lpr=101 pi=[57,101)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 236.764617920s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:56 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 101 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=101 pruub=9.382432938s) [1] r=-1 lpr=101 pi=[57,101)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 236.764617920s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:56 compute-1 ceph-mon[80115]: osdmap e100: 3 total, 3 up, 3 in
Feb 02 09:42:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Feb 02 09:42:56 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 101 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=100/101 n=5 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=100) [1]/[0] async=[1] r=0 lpr=100 pi=[57,100)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Feb 02 09:42:56 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 102 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=100/101 n=5 ec=57/38 lis/c=100/57 les/c/f=101/58/0 sis=102 pruub=15.896315575s) [1] async=[1] r=-1 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 243.404632568s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:56 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 102 pg[9.11( v 44'1041 (0'0,44'1041] local-lis/les=100/101 n=5 ec=57/38 lis/c=100/57 les/c/f=101/58/0 sis=102 pruub=15.896131516s) [1] r=-1 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 243.404632568s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:56 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 102 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=102) [1]/[0] r=0 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:56 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 102 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=57/58 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=102) [1]/[0] r=0 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Feb 02 09:42:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:56.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:57 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:57.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:57 compute-1 ceph-mon[80115]: pgmap v48: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:57 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Feb 02 09:42:57 compute-1 ceph-mon[80115]: osdmap e101: 3 total, 3 up, 3 in
Feb 02 09:42:57 compute-1 ceph-mon[80115]: osdmap e102: 3 total, 3 up, 3 in
Feb 02 09:42:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Feb 02 09:42:57 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 103 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=102/103 n=4 ec=57/38 lis/c=57/57 les/c/f=58/58/0 sis=102) [1]/[0] async=[1] r=0 lpr=102 pi=[57,102)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:42:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:57 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:42:58 compute-1 ceph-mon[80115]: osdmap e103: 3 total, 3 up, 3 in
Feb 02 09:42:58 compute-1 ceph-mon[80115]: pgmap v52: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:42:58 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Feb 02 09:42:58 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Feb 02 09:42:58 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 104 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=102/103 n=4 ec=57/38 lis/c=102/57 les/c/f=103/58/0 sis=104 pruub=14.986758232s) [1] async=[1] r=-1 lpr=104 pi=[57,104)/1 crt=44'1041 lcod 0'0 mlcod 0'0 active pruub 244.532653809s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:42:58 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 104 pg[9.12( v 44'1041 (0'0,44'1041] local-lis/les=102/103 n=4 ec=57/38 lis/c=102/57 les/c/f=103/58/0 sis=104 pruub=14.986701012s) [1] r=-1 lpr=104 pi=[57,104)/1 crt=44'1041 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 244.532653809s@ mbc={}] state<Start>: transitioning to Stray
Feb 02 09:42:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:42:58.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:42:59 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:42:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:42:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:42:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:42:59.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:42:59 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Feb 02 09:42:59 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Feb 02 09:42:59 compute-1 ceph-mon[80115]: osdmap e104: 3 total, 3 up, 3 in
Feb 02 09:43:00 compute-1 python3.9[90643]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:43:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Feb 02 09:43:00 compute-1 ceph-mon[80115]: osdmap e105: 3 total, 3 up, 3 in
Feb 02 09:43:00 compute-1 ceph-mon[80115]: pgmap v55: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Feb 02 09:43:00 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Feb 02 09:43:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:00.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:43:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:00 compute-1 python3.9[90794]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:43:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:01 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:01.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Feb 02 09:43:01 compute-1 ceph-mon[80115]: osdmap e106: 3 total, 3 up, 3 in
Feb 02 09:43:02 compute-1 python3.9[90948]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:43:02 compute-1 ceph-mon[80115]: pgmap v57: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Feb 02 09:43:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Feb 02 09:43:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:43:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Feb 02 09:43:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:43:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:02.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:43:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:03 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:03.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:03 compute-1 sudo[91105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivwnpvshatoselysrxfiyyqrieupelyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025382.9154332-329-62301884602417/AnsiballZ_setup.py'
Feb 02 09:43:03 compute-1 sudo[91105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:03 compute-1 python3.9[91107]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:43:03 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Feb 02 09:43:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Feb 02 09:43:03 compute-1 ceph-mon[80115]: osdmap e107: 3 total, 3 up, 3 in
Feb 02 09:43:03 compute-1 sudo[91105]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094303 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:43:04 compute-1 sudo[91189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynmmnvwjymierlzwgelsoqidrcxkyujy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025382.9154332-329-62301884602417/AnsiballZ_dnf.py'
Feb 02 09:43:04 compute-1 sudo[91189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:04 compute-1 python3.9[91191]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:43:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:04.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:04 compute-1 ceph-mon[80115]: osdmap e108: 3 total, 3 up, 3 in
Feb 02 09:43:04 compute-1 ceph-mon[80115]: pgmap v60: 353 pgs: 1 unknown, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 226 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:43:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:04 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Feb 02 09:43:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:05 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:05.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:43:05 compute-1 ceph-mon[80115]: osdmap e109: 3 total, 3 up, 3 in
Feb 02 09:43:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Feb 02 09:43:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:06.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Feb 02 09:43:06 compute-1 ceph-mon[80115]: osdmap e110: 3 total, 3 up, 3 in
Feb 02 09:43:06 compute-1 ceph-mon[80115]: pgmap v63: 353 pgs: 1 unknown, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:43:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:07 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:07.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:07 compute-1 ceph-mon[80115]: osdmap e111: 3 total, 3 up, 3 in
Feb 02 09:43:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:08.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:08 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:08 compute-1 ceph-mon[80115]: pgmap v65: 353 pgs: 1 unknown, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:43:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:08 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:09 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:09.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Feb 02 09:43:09 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Feb 02 09:43:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:43:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:10.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:10 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:10 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:10 compute-1 ceph-mon[80115]: pgmap v66: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s; 36 B/s, 1 objects/s recovering
Feb 02 09:43:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Feb 02 09:43:10 compute-1 ceph-mon[80115]: osdmap e112: 3 total, 3 up, 3 in
Feb 02 09:43:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Feb 02 09:43:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:11 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:11.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:11 compute-1 sudo[91266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:43:11 compute-1 sudo[91266]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:43:11 compute-1 sudo[91266]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Feb 02 09:43:12 compute-1 ceph-mon[80115]: osdmap e113: 3 total, 3 up, 3 in
Feb 02 09:43:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Feb 02 09:43:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:12.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:12 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:13.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:13 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:13 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80016a0 fd 50 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:13 compute-1 ceph-mon[80115]: pgmap v69: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s; 36 B/s, 1 objects/s recovering
Feb 02 09:43:13 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Feb 02 09:43:13 compute-1 ceph-mon[80115]: osdmap e114: 3 total, 3 up, 3 in
Feb 02 09:43:13 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Feb 02 09:43:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:14.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:14 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:14 compute-1 ceph-mon[80115]: osdmap e115: 3 total, 3 up, 3 in
Feb 02 09:43:14 compute-1 ceph-mon[80115]: pgmap v72: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s; 27 B/s, 0 objects/s recovering
Feb 02 09:43:14 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Feb 02 09:43:14 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Feb 02 09:43:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:14 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:15 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:15.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:43:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Feb 02 09:43:15 compute-1 ceph-mon[80115]: osdmap e116: 3 total, 3 up, 3 in
Feb 02 09:43:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:16.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:16 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:16 compute-1 ceph-mon[80115]: pgmap v74: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 209 B/s rd, 0 op/s; 22 B/s, 0 objects/s recovering
Feb 02 09:43:16 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Feb 02 09:43:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Feb 02 09:43:16 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 117 pg[9.19( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=82/82 les/c/f=83/83/0 sis=117) [0] r=0 lpr=117 pi=[82,117)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:16 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:17 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:17.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Feb 02 09:43:17 compute-1 ceph-mon[80115]: osdmap e117: 3 total, 3 up, 3 in
Feb 02 09:43:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:43:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Feb 02 09:43:17 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 118 pg[9.19( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=82/82 les/c/f=83/83/0 sis=118) [0]/[2] r=-1 lpr=118 pi=[82,118)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:17 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 118 pg[9.19( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=82/82 les/c/f=83/83/0 sis=118) [0]/[2] r=-1 lpr=118 pi=[82,118)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:43:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:18.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:18 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:18 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:19 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:19 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Feb 02 09:43:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:43:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:19.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:43:19 compute-1 ceph-mon[80115]: osdmap e118: 3 total, 3 up, 3 in
Feb 02 09:43:19 compute-1 ceph-mon[80115]: pgmap v77: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:43:19 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Feb 02 09:43:19 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 119 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=119) [0] r=0 lpr=119 pi=[84,119)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Feb 02 09:43:20 compute-1 ceph-mon[80115]: osdmap e119: 3 total, 3 up, 3 in
Feb 02 09:43:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Feb 02 09:43:20 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:20 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:20 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:43:20 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 02 09:43:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:20.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:21 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:21.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Feb 02 09:43:21 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:43:21 compute-1 ceph-mon[80115]: pgmap v79: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 397 B/s rd, 0 op/s
Feb 02 09:43:21 compute-1 ceph-mon[80115]: osdmap e120: 3 total, 3 up, 3 in
Feb 02 09:43:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Feb 02 09:43:21 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:21 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:22 compute-1 ceph-mon[80115]: osdmap e121: 3 total, 3 up, 3 in
Feb 02 09:43:22 compute-1 ceph-mon[80115]: osdmap e122: 3 total, 3 up, 3 in
Feb 02 09:43:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:22.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:22 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Feb 02 09:43:22 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:43:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:22 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:23 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:23.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:23 compute-1 ceph-mon[80115]: pgmap v83: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:43:23 compute-1 ceph-mon[80115]: osdmap e123: 3 total, 3 up, 3 in
Feb 02 09:43:24 compute-1 ceph-mon[80115]: pgmap v85: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s; 54 B/s, 2 objects/s recovering
Feb 02 09:43:24 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Feb 02 09:43:24 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Feb 02 09:43:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:43:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:24.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:43:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:24 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:24 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:25 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:25.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Feb 02 09:43:25 compute-1 ceph-mon[80115]: osdmap e124: 3 total, 3 up, 3 in
Feb 02 09:43:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Feb 02 09:43:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:43:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:25 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:43:26 compute-1 ceph-mon[80115]: osdmap e125: 3 total, 3 up, 3 in
Feb 02 09:43:26 compute-1 ceph-mon[80115]: pgmap v88: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 224 B/s rd, 0 op/s; 48 B/s, 2 objects/s recovering
Feb 02 09:43:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Feb 02 09:43:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:26.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Feb 02 09:43:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:27 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:27.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Feb 02 09:43:27 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:27 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Feb 02 09:43:27 compute-1 ceph-mon[80115]: osdmap e126: 3 total, 3 up, 3 in
Feb 02 09:43:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:28.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:28 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:28 compute-1 ceph-mon[80115]: osdmap e127: 3 total, 3 up, 3 in
Feb 02 09:43:28 compute-1 ceph-mon[80115]: pgmap v91: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:43:28 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Feb 02 09:43:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Feb 02 09:43:28 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:43:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:28 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:29 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:29.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:29 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Feb 02 09:43:29 compute-1 ceph-mon[80115]: osdmap e128: 3 total, 3 up, 3 in
Feb 02 09:43:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Feb 02 09:43:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:43:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:30.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:30 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:30 compute-1 ceph-mon[80115]: osdmap e129: 3 total, 3 up, 3 in
Feb 02 09:43:30 compute-1 ceph-mon[80115]: pgmap v94: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s; 27 B/s, 0 objects/s recovering
Feb 02 09:43:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Feb 02 09:43:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:30 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:31 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a2f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:43:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:31.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:43:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Feb 02 09:43:31 compute-1 sudo[91379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:43:31 compute-1 sudo[91379]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:43:31 compute-1 sudo[91379]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:31 compute-1 ceph-mon[80115]: osdmap e130: 3 total, 3 up, 3 in
Feb 02 09:43:31 compute-1 ceph-mon[80115]: osdmap e131: 3 total, 3 up, 3 in
Feb 02 09:43:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Feb 02 09:43:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:32.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:32 compute-1 ceph-mon[80115]: pgmap v97: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s; 27 B/s, 0 objects/s recovering
Feb 02 09:43:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:43:32 compute-1 ceph-mon[80115]: osdmap e132: 3 total, 3 up, 3 in
Feb 02 09:43:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:43:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:33.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:43:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094333 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:43:33 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Feb 02 09:43:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Feb 02 09:43:33 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:34.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a310 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:34 compute-1 ceph-mon[80115]: pgmap v99: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 52 B/s, 1 objects/s recovering
Feb 02 09:43:34 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Feb 02 09:43:34 compute-1 ceph-mon[80115]: osdmap e133: 3 total, 3 up, 3 in
Feb 02 09:43:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Feb 02 09:43:35 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:35 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:43:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:35 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:35.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:43:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Feb 02 09:43:36 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:36 compute-1 ceph-mon[80115]: osdmap e134: 3 total, 3 up, 3 in
Feb 02 09:43:36 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Feb 02 09:43:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Feb 02 09:43:36 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:36 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 09:43:36 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:36 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:36.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a330 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:37 compute-1 ceph-mon[80115]: pgmap v102: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 48 B/s, 1 objects/s recovering
Feb 02 09:43:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Feb 02 09:43:37 compute-1 ceph-mon[80115]: osdmap e135: 3 total, 3 up, 3 in
Feb 02 09:43:37 compute-1 ceph-mon[80115]: osdmap e136: 3 total, 3 up, 3 in
Feb 02 09:43:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:37 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:37.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Feb 02 09:43:37 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:43:38 compute-1 ceph-mon[80115]: osdmap e137: 3 total, 3 up, 3 in
Feb 02 09:43:38 compute-1 ceph-mon[80115]: pgmap v106: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:43:38 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Feb 02 09:43:38 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 09:43:38 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 09:43:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:38.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:39 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a350 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:43:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:39.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:43:39 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Feb 02 09:43:39 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 09:43:39 compute-1 ceph-mon[80115]: osdmap e138: 3 total, 3 up, 3 in
Feb 02 09:43:40 compute-1 ceph-mon[80115]: osdmap e139: 3 total, 3 up, 3 in
Feb 02 09:43:40 compute-1 ceph-mon[80115]: pgmap v109: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 526 B/s rd, 0 op/s; 28 B/s, 2 objects/s recovering
Feb 02 09:43:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:43:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:40.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a350 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:41 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:41.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:43:42 compute-1 sudo[91189]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:42.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a350 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e400a350 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:42 compute-1 ceph-mon[80115]: pgmap v110: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 369 B/s rd, 0 op/s; 19 B/s, 1 objects/s recovering
Feb 02 09:43:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:43 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:43.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:44.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:45 compute-1 ceph-mon[80115]: pgmap v111: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1.1 KiB/s wr, 3 op/s; 16 B/s, 1 objects/s recovering
Feb 02 09:43:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:45.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:43:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:43:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:43:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:46.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:47 compute-1 ceph-mon[80115]: pgmap v112: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 895 B/s wr, 2 op/s; 13 B/s, 1 objects/s recovering
Feb 02 09:43:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:47 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:47.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:43:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:43:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:48.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:49 compute-1 ceph-mon[80115]: pgmap v113: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 752 B/s wr, 2 op/s
Feb 02 09:43:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:49 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:49.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:50 compute-1 ceph-mon[80115]: mgrmap e30: compute-0.djvyfo(active, since 92s), standbys: compute-2.gzlyac, compute-1.teascl
Feb 02 09:43:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:43:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:50.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:51 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:51 compute-1 ceph-mon[80115]: pgmap v114: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1.3 KiB/s wr, 3 op/s
Feb 02 09:43:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:51.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:51 compute-1 sudo[91565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cevfxqxghidvvyzuduxsmaxtgkmwpfun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025431.1962197-365-53347667112394/AnsiballZ_command.py'
Feb 02 09:43:51 compute-1 sudo[91565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:51 compute-1 sudo[91566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:43:51 compute-1 sudo[91566]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:43:51 compute-1 sudo[91566]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:51 compute-1 python3.9[91580]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:43:52 compute-1 sudo[91565]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:43:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:52.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:43:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:53 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:53 compute-1 sudo[91878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kffgsshatacbolczlceiiboabqnahqwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025432.5977435-389-204861884161242/AnsiballZ_selinux.py'
Feb 02 09:43:53 compute-1 sudo[91878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:53 compute-1 ceph-mon[80115]: pgmap v115: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Feb 02 09:43:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:53.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:53 compute-1 sudo[91881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:43:53 compute-1 sudo[91881]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:43:53 compute-1 sudo[91881]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:53 compute-1 python3.9[91880]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 02 09:43:53 compute-1 sudo[91878]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:53 compute-1 sudo[91906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:43:53 compute-1 sudo[91906]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:43:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094353 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:43:53 compute-1 sudo[91906]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:54 compute-1 sudo[92110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcmanenehebjppjxysxexzryglvmhpqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025434.10801-422-213100508479522/AnsiballZ_command.py'
Feb 02 09:43:54 compute-1 sudo[92110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:54 compute-1 python3.9[92112]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 02 09:43:54 compute-1 sudo[92110]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:54.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:55 compute-1 sudo[92263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmglakexluhgfzjjbegvzvvdkggujhrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025434.8734264-446-129879133089361/AnsiballZ_file.py'
Feb 02 09:43:55 compute-1 sudo[92263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:55 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:55 compute-1 ceph-mon[80115]: pgmap v116: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Feb 02 09:43:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:55.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:55 compute-1 python3.9[92265]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:43:55 compute-1 sudo[92263]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:43:55 compute-1 sudo[92415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoatqcrnuosajdbnebtxgtskiqmcfzfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025435.501458-470-139900633554889/AnsiballZ_mount.py'
Feb 02 09:43:55 compute-1 sudo[92415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:56 compute-1 python3.9[92417]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 02 09:43:56 compute-1 sudo[92415]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:56.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:57 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc002830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:57 compute-1 ceph-mon[80115]: pgmap v117: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Feb 02 09:43:57 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:43:57 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:43:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:57.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:57 compute-1 sudo[92568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djkrkhrbegudbrnkqmtysmclzyqhdcdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025437.1747832-554-154731930340413/AnsiballZ_file.py'
Feb 02 09:43:57 compute-1 sudo[92568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:57 compute-1 python3.9[92570]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:43:57 compute-1 sudo[92568]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:58 compute-1 sudo[92720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bauiczirqkoewkwaoeqtafbcqsygihlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025437.8895144-579-225748778361840/AnsiballZ_stat.py'
Feb 02 09:43:58 compute-1 sudo[92720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:58 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:43:58 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:43:58 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:43:58 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:43:58 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:43:58 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:43:58 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:43:58 compute-1 python3.9[92722]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:43:58 compute-1 sudo[92720]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:43:58.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:58 compute-1 sudo[92798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-divkfxuncnocrgypryjvynuwgyupjdnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025437.8895144-579-225748778361840/AnsiballZ_file.py'
Feb 02 09:43:58 compute-1 sudo[92798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:43:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:58 compute-1 python3.9[92800]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:43:58 compute-1 sudo[92798]: pam_unix(sudo:session): session closed for user root
Feb 02 09:43:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:43:59 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:43:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:43:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:43:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:43:59.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:43:59 compute-1 ceph-mon[80115]: pgmap v118: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 697 B/s wr, 2 op/s
Feb 02 09:44:00 compute-1 sudo[92951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skbyfizeurkvgwqfkznrpbyopfsakqaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025439.897622-641-209971624502943/AnsiballZ_stat.py'
Feb 02 09:44:00 compute-1 sudo[92951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:00 compute-1 python3.9[92953]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:44:00 compute-1 sudo[92951]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:00 compute-1 ceph-mon[80115]: pgmap v119: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 697 B/s wr, 2 op/s
Feb 02 09:44:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:00.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:01 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:01.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:01 compute-1 sudo[93106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjqvxfmougxaitzcjjlrzvjqmpmqzxnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025441.0432317-680-105895009390509/AnsiballZ_getent.py'
Feb 02 09:44:01 compute-1 sudo[93106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:01 compute-1 python3.9[93108]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 02 09:44:01 compute-1 sudo[93106]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:02 compute-1 sudo[93259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqzgsnlfnbqgckazxifdoglvqffrenuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025442.1346796-710-270608925294415/AnsiballZ_getent.py'
Feb 02 09:44:02 compute-1 sudo[93259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:02 compute-1 python3.9[93261]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 02 09:44:02 compute-1 sudo[93259]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:02.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:02 compute-1 sudo[93263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:44:02 compute-1 sudo[93263]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:44:02 compute-1 sudo[93263]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:02 compute-1 ceph-mon[80115]: pgmap v120: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 261 B/s rd, 87 B/s wr, 0 op/s
Feb 02 09:44:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:44:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:44:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:44:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:03 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:03.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:03 compute-1 sudo[93438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjzvjptgrfsgjrardbbrourmzfgsvlxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025442.8473032-734-201035447817/AnsiballZ_group.py'
Feb 02 09:44:03 compute-1 sudo[93438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:03 compute-1 python3.9[93440]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 02 09:44:03 compute-1 sudo[93438]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:04 compute-1 sudo[93590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykgvphsigndxkppmqzmqgidhtpqobdug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025443.812351-761-114538909253653/AnsiballZ_file.py'
Feb 02 09:44:04 compute-1 sudo[93590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:04 compute-1 python3.9[93592]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 02 09:44:04 compute-1 sudo[93590]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:04.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:04 compute-1 ceph-mon[80115]: pgmap v121: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 522 B/s rd, 87 B/s wr, 0 op/s
Feb 02 09:44:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:05 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:05.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:05 compute-1 sudo[93743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elcaxzpnnctvlvgwddjudskgmhlcvhlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025444.935514-794-149621930548842/AnsiballZ_dnf.py'
Feb 02 09:44:05 compute-1 sudo[93743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:05 compute-1 python3.9[93745]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:44:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:06.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:06 compute-1 ceph-mon[80115]: pgmap v122: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 261 B/s rd, 0 op/s
Feb 02 09:44:06 compute-1 sudo[93743]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:07 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:07.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:07 compute-1 sudo[93897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wimpilinwhpnzsxhnavginxnqezaaklr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025447.1449535-818-281272838452313/AnsiballZ_file.py'
Feb 02 09:44:07 compute-1 sudo[93897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:07 compute-1 python3.9[93899]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:44:07 compute-1 sudo[93897]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:08 compute-1 sudo[94049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dirwomtvjwanyejfeeemftjsqrmmxwyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025447.8607213-842-11163585779684/AnsiballZ_stat.py'
Feb 02 09:44:08 compute-1 sudo[94049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:08 compute-1 python3.9[94051]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:44:08 compute-1 sudo[94049]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:08 compute-1 sudo[94127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtrqhfeiltnghfwgorjlcfktclgyzulc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025447.8607213-842-11163585779684/AnsiballZ_file.py'
Feb 02 09:44:08 compute-1 sudo[94127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:08 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:08.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:08 compute-1 python3.9[94129]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:44:08 compute-1 sudo[94127]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:08 compute-1 ceph-mon[80115]: pgmap v123: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 261 B/s rd, 0 op/s
Feb 02 09:44:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:08 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:09 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:09.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:09 compute-1 sudo[94282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szfahuzkyyopnpdarfmzevkhovanfzls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025449.098409-881-188381008225428/AnsiballZ_stat.py'
Feb 02 09:44:09 compute-1 sudo[94282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:09 compute-1 sshd-session[94155]: Invalid user solv from 80.94.92.184 port 60244
Feb 02 09:44:09 compute-1 sshd-session[94155]: Connection closed by invalid user solv 80.94.92.184 port 60244 [preauth]
Feb 02 09:44:09 compute-1 python3.9[94284]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:44:09 compute-1 sudo[94282]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:09 compute-1 sudo[94360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkemezhspqucvrlkscdluhvsebzzyknq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025449.098409-881-188381008225428/AnsiballZ_file.py'
Feb 02 09:44:09 compute-1 sudo[94360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:09 compute-1 python3.9[94362]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:44:09 compute-1 sudo[94360]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:10 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:10 compute-1 sudo[94513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unotvfjzhrbarxklbmvrjxyxcykjlzgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025450.5537655-926-147516078928354/AnsiballZ_dnf.py'
Feb 02 09:44:10 compute-1 sudo[94513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:10 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:10.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:10 compute-1 ceph-mon[80115]: pgmap v124: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:44:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:11 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:11 compute-1 python3.9[94515]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:44:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:11.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:11 compute-1 sudo[94517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:44:11 compute-1 sudo[94517]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:44:11 compute-1 sudo[94517]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:12 compute-1 sudo[94513]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:12 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:12 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:12.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:13 compute-1 ceph-mon[80115]: pgmap v125: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.018688) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453018734, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2994, "num_deletes": 251, "total_data_size": 8909660, "memory_usage": 9050384, "flush_reason": "Manual Compaction"}
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453052326, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 5479434, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7760, "largest_seqno": 10749, "table_properties": {"data_size": 5466280, "index_size": 8496, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3589, "raw_key_size": 30984, "raw_average_key_size": 21, "raw_value_size": 5438333, "raw_average_value_size": 3854, "num_data_blocks": 372, "num_entries": 1411, "num_filter_entries": 1411, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025339, "oldest_key_time": 1770025339, "file_creation_time": 1770025453, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 33689 microseconds, and 6790 cpu microseconds.
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.052379) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 5479434 bytes OK
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.052401) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.058432) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.058457) EVENT_LOG_v1 {"time_micros": 1770025453058450, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.058478) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8895363, prev total WAL file size 8895363, number of live WAL files 2.
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.059997) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(5351KB)], [18(11MB)]
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453060054, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17908171, "oldest_snapshot_seqno": -1}
Feb 02 09:44:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:13 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4077 keys, 14270142 bytes, temperature: kUnknown
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453147093, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14270142, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14237502, "index_size": 21330, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10245, "raw_key_size": 104104, "raw_average_key_size": 25, "raw_value_size": 14157384, "raw_average_value_size": 3472, "num_data_blocks": 916, "num_entries": 4077, "num_filter_entries": 4077, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025453, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.147412) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14270142 bytes
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.148704) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 205.5 rd, 163.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.2, 11.9 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 4608, records dropped: 531 output_compression: NoCompression
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.148734) EVENT_LOG_v1 {"time_micros": 1770025453148720, "job": 8, "event": "compaction_finished", "compaction_time_micros": 87159, "compaction_time_cpu_micros": 19720, "output_level": 6, "num_output_files": 1, "total_output_size": 14270142, "num_input_records": 4608, "num_output_records": 4077, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453149812, "job": 8, "event": "table_file_deletion", "file_number": 20}
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025453151575, "job": 8, "event": "table_file_deletion", "file_number": 18}
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.059912) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:44:13 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:44:13.151676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:44:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:44:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:13.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:44:13 compute-1 python3.9[94692]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:44:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:14 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:14 compute-1 python3.9[94844]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 02 09:44:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:14 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0003e30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:14.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:15 compute-1 ceph-mon[80115]: pgmap v126: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:44:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:15 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:15.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:15 compute-1 python3.9[94995]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:44:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:16 compute-1 sudo[95145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsaebmjfdoixnesisvwhczlifdmpkcat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025455.9272902-1049-82153893857790/AnsiballZ_systemd.py'
Feb 02 09:44:16 compute-1 sudo[95145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:16 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:16 compute-1 python3.9[95147]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:44:16 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 02 09:44:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:16 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:16 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Feb 02 09:44:16 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 02 09:44:16 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 02 09:44:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:16.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:17 compute-1 ceph-mon[80115]: pgmap v127: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:44:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:17 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:17 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 02 09:44:17 compute-1 sudo[95145]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:17.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:17 compute-1 python3.9[95312]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 02 09:44:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:44:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:18 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:18 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:18.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:19 compute-1 ceph-mon[80115]: pgmap v128: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:44:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:19 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:19.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:20 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:20.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:21 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:21 compute-1 ceph-mon[80115]: pgmap v129: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:44:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:21.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:21 compute-1 sudo[95464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltzjvxakhjimbldhjscofyegzmjintmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025461.2227452-1220-265921133316773/AnsiballZ_systemd.py'
Feb 02 09:44:21 compute-1 sudo[95464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:21 compute-1 python3.9[95466]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:44:21 compute-1 sudo[95464]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094421 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:44:22 compute-1 sudo[95618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yddxuvtipwnrphcsezraoijgtjebjvsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025461.9797478-1220-123831865432438/AnsiballZ_systemd.py'
Feb 02 09:44:22 compute-1 sudo[95618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:22 compute-1 python3.9[95620]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:44:22 compute-1 sudo[95618]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:22 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:22 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:22.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:23 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:23 compute-1 ceph-mon[80115]: pgmap v130: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:44:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:23.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:23 compute-1 sshd-session[89219]: Connection closed by 192.168.122.30 port 39176
Feb 02 09:44:23 compute-1 sshd-session[89216]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:44:23 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Feb 02 09:44:23 compute-1 systemd[1]: session-38.scope: Consumed 59.787s CPU time.
Feb 02 09:44:23 compute-1 systemd-logind[805]: Session 38 logged out. Waiting for processes to exit.
Feb 02 09:44:23 compute-1 systemd-logind[805]: Removed session 38.
Feb 02 09:44:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:24 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:24 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:24.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:25 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:25 compute-1 ceph-mon[80115]: pgmap v131: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:44:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:25.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:26 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:26.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:27 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:27 compute-1 ceph-mon[80115]: pgmap v132: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:44:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:27.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:28 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:28 compute-1 sshd-session[95650]: Accepted publickey for zuul from 192.168.122.30 port 54520 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:44:28 compute-1 systemd-logind[805]: New session 39 of user zuul.
Feb 02 09:44:28 compute-1 systemd[1]: Started Session 39 of User zuul.
Feb 02 09:44:28 compute-1 sshd-session[95650]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:44:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:28 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:28.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:29 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:29 compute-1 ceph-mon[80115]: pgmap v133: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:44:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:29.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:29 compute-1 python3.9[95804]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:44:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:29 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:44:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:30 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:30 compute-1 sudo[95959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohqbtlidzcmjoicmzqyliokkfkbqffue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025470.4433987-64-227774448225860/AnsiballZ_getent.py'
Feb 02 09:44:30 compute-1 sudo[95959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:30 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:44:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:30.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:44:31 compute-1 python3.9[95961]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 02 09:44:31 compute-1 sudo[95959]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:31 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:31 compute-1 ceph-mon[80115]: pgmap v134: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:44:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:31.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:31 compute-1 sudo[96060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:44:31 compute-1 sudo[96060]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:44:31 compute-1 sudo[96060]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:31 compute-1 sudo[96137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udusqlkhkahmauuqnsyqrkoeobdmtdes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025471.5135114-100-262049837874696/AnsiballZ_setup.py'
Feb 02 09:44:31 compute-1 sudo[96137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:32 compute-1 python3.9[96139]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:44:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:44:32 compute-1 sudo[96137]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:32 compute-1 sudo[96221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cautzeirgyknbfxjrdqpyvvspncwghjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025471.5135114-100-262049837874696/AnsiballZ_dnf.py'
Feb 02 09:44:32 compute-1 sudo[96221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d8001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:32 compute-1 python3.9[96223]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 02 09:44:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:44:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:32 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:44:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:32.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:33 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:33.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:33 compute-1 ceph-mon[80115]: pgmap v135: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:44:33 compute-1 sudo[96221]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80036c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:34 compute-1 sudo[96378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qflzazzeixcdmcatthwjmkfrugaxaxtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025474.5714312-142-119168501589649/AnsiballZ_dnf.py'
Feb 02 09:44:34 compute-1 sudo[96378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:34 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:34.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:35 compute-1 python3.9[96380]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:44:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:35 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000054s ======
Feb 02 09:44:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:35.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Feb 02 09:44:35 compute-1 ceph-mon[80115]: pgmap v136: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:44:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:35 compute-1 sshd-session[96226]: Connection closed by authenticating user root 123.58.212.100 port 35204 [preauth]
Feb 02 09:44:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:35 : epoch 6980717c : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:44:36 compute-1 sudo[96378]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:36 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:36 compute-1 sshd-session[96382]: Connection closed by authenticating user root 123.58.212.100 port 35220 [preauth]
Feb 02 09:44:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:36.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:37 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:37 compute-1 sudo[96534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhxypzxcuaaguykfeauudvdqjztgerob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025476.4913523-166-69916960886466/AnsiballZ_systemd.py'
Feb 02 09:44:37 compute-1 sudo[96534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:37.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:37 compute-1 ceph-mon[80115]: pgmap v137: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:44:37 compute-1 python3.9[96536]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 02 09:44:37 compute-1 sudo[96534]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:38 compute-1 sshd-session[96537]: Connection closed by authenticating user root 123.58.212.100 port 35234 [preauth]
Feb 02 09:44:38 compute-1 python3.9[96691]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:44:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:38 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003d10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:39.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:39 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:39.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:39 compute-1 sudo[96844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-felnhkqmmeecdozkzbenhkbekhxsfmrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025478.8173027-220-260474232549451/AnsiballZ_sefcontext.py'
Feb 02 09:44:39 compute-1 sudo[96844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:39 compute-1 ceph-mon[80115]: pgmap v138: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:44:39 compute-1 python3.9[96846]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 02 09:44:39 compute-1 sudo[96844]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:39 compute-1 sshd-session[96699]: Connection closed by authenticating user root 123.58.212.100 port 35244 [preauth]
Feb 02 09:44:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:40 compute-1 python3.9[96998]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:44:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:40 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:41.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:41 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003d30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:41 compute-1 sshd-session[96895]: Connection closed by authenticating user root 123.58.212.100 port 35250 [preauth]
Feb 02 09:44:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:41.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:41 compute-1 ceph-mon[80115]: pgmap v139: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:44:41 compute-1 sudo[97155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npbghnzhxdjnqbegmczhhdpsulifzhjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025481.135725-274-43947628690102/AnsiballZ_dnf.py'
Feb 02 09:44:41 compute-1 sudo[97155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:41 compute-1 python3.9[97157]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:44:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094441 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:44:42 compute-1 ceph-mon[80115]: pgmap v140: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:44:42 compute-1 sshd-session[97158]: Connection closed by authenticating user root 123.58.212.100 port 36082 [preauth]
Feb 02 09:44:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:42 compute-1 sudo[97155]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:42 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:43.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:43 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:43.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:43 compute-1 sudo[97313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpkrwauulwxjjknhprzpqexmasybdyii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025483.1062763-298-95587557082618/AnsiballZ_command.py'
Feb 02 09:44:43 compute-1 sudo[97313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:43 compute-1 python3.9[97315]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:44:43 compute-1 sshd-session[97186]: Connection closed by authenticating user root 123.58.212.100 port 36086 [preauth]
Feb 02 09:44:44 compute-1 sudo[97313]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2b8003d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:44 compute-1 ceph-mon[80115]: pgmap v141: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:44:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:44 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:45.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:45 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:45 compute-1 sudo[97603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-matcohfoqkvbkhjzhtqfgrvwjqulblfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025484.7233882-322-152609936805204/AnsiballZ_file.py'
Feb 02 09:44:45 compute-1 sudo[97603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:45.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:45 compute-1 sshd-session[97382]: Connection closed by authenticating user root 123.58.212.100 port 36088 [preauth]
Feb 02 09:44:45 compute-1 python3.9[97605]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 02 09:44:45 compute-1 sudo[97603]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:46 compute-1 python3.9[97757]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:44:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:46 compute-1 ceph-mon[80115]: pgmap v142: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:44:46 compute-1 sudo[97910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stujqzlqtzbhwgsidwuhrgkzbxosvary ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025486.5913184-370-95583996928132/AnsiballZ_dnf.py'
Feb 02 09:44:46 compute-1 sudo[97910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:46 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:44:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:47.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:44:47 compute-1 python3.9[97912]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:44:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:47 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:47.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:47 compute-1 sshd-session[97652]: Connection closed by authenticating user root 123.58.212.100 port 36090 [preauth]
Feb 02 09:44:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:44:48 compute-1 sudo[97910]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:48 compute-1 ceph-mon[80115]: pgmap v143: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:44:48 compute-1 sudo[98067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uifkdiuoyymplwvgpbjupzejzgyriitk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025488.5495977-397-7781207222980/AnsiballZ_dnf.py'
Feb 02 09:44:48 compute-1 sudo[98067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:48 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c4003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:49.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:49 compute-1 python3.9[98069]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:44:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:49 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:49.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:50 compute-1 sshd-session[98070]: Connection closed by authenticating user root 123.58.212.100 port 36104 [preauth]
Feb 02 09:44:50 compute-1 sudo[98067]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:50 compute-1 ceph-mon[80115]: pgmap v144: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:44:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:50 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:51 compute-1 sudo[98223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztpqwvekhbxobworldswffsoiejyzqly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025490.7333932-433-162698196758492/AnsiballZ_stat.py'
Feb 02 09:44:51 compute-1 sudo[98223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:51.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:51 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:51 compute-1 python3.9[98225]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:44:51 compute-1 sudo[98223]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:51.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:51 compute-1 sudo[98306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:44:51 compute-1 sudo[98306]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:44:51 compute-1 sudo[98306]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:51 compute-1 sudo[98404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycnctrmpvqhzjcpwgcsrdzhgsutdxpps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025491.4189532-457-252651901157121/AnsiballZ_slurp.py'
Feb 02 09:44:51 compute-1 sudo[98404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:44:52 compute-1 python3.9[98406]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Feb 02 09:44:52 compute-1 sudo[98404]: pam_unix(sudo:session): session closed for user root
Feb 02 09:44:52 compute-1 sshd-session[98252]: Connection closed by authenticating user root 123.58.212.100 port 36114 [preauth]
Feb 02 09:44:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:52 compute-1 ceph-mon[80115]: pgmap v145: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:44:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:52 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:53.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:53 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:53.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:53 compute-1 sshd-session[95654]: Connection closed by 192.168.122.30 port 54520
Feb 02 09:44:53 compute-1 sshd-session[95650]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:44:53 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Feb 02 09:44:53 compute-1 systemd[1]: session-39.scope: Consumed 16.376s CPU time.
Feb 02 09:44:53 compute-1 systemd-logind[805]: Session 39 logged out. Waiting for processes to exit.
Feb 02 09:44:53 compute-1 systemd-logind[805]: Removed session 39.
Feb 02 09:44:54 compute-1 sshd-session[98432]: Connection closed by authenticating user root 123.58.212.100 port 41174 [preauth]
Feb 02 09:44:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:54 compute-1 ceph-mon[80115]: pgmap v146: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:44:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:54 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb 02 09:44:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:55.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb 02 09:44:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:55 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:55.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:44:56 compute-1 sshd-session[98435]: Connection closed by authenticating user root 123.58.212.100 port 41184 [preauth]
Feb 02 09:44:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:56 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:56 compute-1 ceph-mon[80115]: pgmap v147: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:44:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:57.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:57 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:57.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:57 compute-1 sshd-session[98438]: Connection closed by authenticating user root 123.58.212.100 port 41194 [preauth]
Feb 02 09:44:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:58 compute-1 sshd-session[98444]: Accepted publickey for zuul from 192.168.122.30 port 52074 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:44:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:58 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:58 compute-1 systemd-logind[805]: New session 40 of user zuul.
Feb 02 09:44:58 compute-1 systemd[1]: Started Session 40 of User zuul.
Feb 02 09:44:58 compute-1 sshd-session[98444]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:44:58 compute-1 ceph-mon[80115]: pgmap v148: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:44:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:44:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:44:59.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:44:59 compute-1 sshd-session[98441]: Connection closed by authenticating user root 123.58.212.100 port 41196 [preauth]
Feb 02 09:44:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:44:59 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:44:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:44:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb 02 09:44:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:44:59.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb 02 09:44:59 compute-1 python3.9[98599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:45:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:00 compute-1 sshd-session[98500]: Connection closed by authenticating user root 123.58.212.100 port 41208 [preauth]
Feb 02 09:45:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:00 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:01 compute-1 ceph-mon[80115]: pgmap v149: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:45:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:01.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:01 compute-1 python3.9[98754]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:45:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:01 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:01.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094501 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:45:02 compute-1 sshd-session[98755]: Connection closed by authenticating user root 123.58.212.100 port 41220 [preauth]
Feb 02 09:45:02 compute-1 python3.9[98949]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:45:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:02 compute-1 sshd-session[98447]: Connection closed by 192.168.122.30 port 52074
Feb 02 09:45:02 compute-1 sshd-session[98444]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:45:02 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Feb 02 09:45:02 compute-1 systemd[1]: session-40.scope: Consumed 2.315s CPU time.
Feb 02 09:45:02 compute-1 systemd-logind[805]: Session 40 logged out. Waiting for processes to exit.
Feb 02 09:45:02 compute-1 systemd-logind[805]: Removed session 40.
Feb 02 09:45:02 compute-1 sudo[98978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:45:02 compute-1 sudo[98978]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:45:02 compute-1 sudo[98978]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:02 compute-1 sudo[99004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:45:02 compute-1 sudo[99004]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:45:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:02 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:03 compute-1 ceph-mon[80115]: pgmap v150: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:45:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:45:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:03.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:03 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:03.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:03 compute-1 sudo[99004]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:03 compute-1 sshd-session[98950]: Connection closed by authenticating user root 123.58.212.100 port 58584 [preauth]
Feb 02 09:45:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2cc001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:04 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:04 compute-1 sshd-session[99061]: Connection closed by authenticating user root 123.58.212.100 port 58596 [preauth]
Feb 02 09:45:05 compute-1 ceph-mon[80115]: pgmap v151: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:45:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:05.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:05 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:05.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:06 compute-1 sshd-session[99064]: Connection closed by authenticating user root 123.58.212.100 port 58598 [preauth]
Feb 02 09:45:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2c00042d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:06 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2e80047c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:07.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:07 compute-1 kernel: ganesha.nfsd[97992]: segfault at 50 ip 00007fd36de9d32e sp 00007fd2d67fb210 error 4 in libntirpc.so.5.8[7fd36de82000+2c000] likely on CPU 1 (core 0, socket 1)
Feb 02 09:45:07 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:45:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[87455]: 02/02/2026 09:45:07 : epoch 6980717c : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd2d80038d0 fd 39 proxy ignored for local
Feb 02 09:45:07 compute-1 systemd[1]: Started Process Core Dump (PID 99069/UID 0).
Feb 02 09:45:07 compute-1 ceph-mon[80115]: pgmap v152: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:45:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:45:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:45:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:45:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:45:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:45:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:45:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:45:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:45:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:45:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb 02 09:45:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:07.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb 02 09:45:07 compute-1 sshd-session[99067]: Connection closed by authenticating user root 123.58.212.100 port 58614 [preauth]
Feb 02 09:45:08 compute-1 systemd-coredump[99070]: Process 87470 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 66:
                                                   #0  0x00007fd36de9d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   #1  0x0000000000000000 n/a (n/a + 0x0)
                                                   #2  0x00007fd36dea7900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                   ELF object binary architecture: AMD x86-64
Feb 02 09:45:08 compute-1 systemd[1]: systemd-coredump@2-99069-0.service: Deactivated successfully.
Feb 02 09:45:08 compute-1 podman[99075]: 2026-02-02 09:45:08.098645896 +0000 UTC m=+0.034704793 container died fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Feb 02 09:45:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-5b0aa32c59c7b5af7c31cc81a030baa01c0eb043b66d432659d68e87d7177710-merged.mount: Deactivated successfully.
Feb 02 09:45:08 compute-1 podman[99075]: 2026-02-02 09:45:08.193466296 +0000 UTC m=+0.129525233 container remove fc0d21172bebee4cc890b402589ad01587ff53c1f4d8ab1d900275946a1bfaf9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:45:08 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:45:08 compute-1 ceph-mon[80115]: pgmap v153: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 181 B/s rd, 0 op/s
Feb 02 09:45:08 compute-1 ceph-mon[80115]: Health check update: 3 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb 02 09:45:08 compute-1 sshd-session[99092]: Accepted publickey for zuul from 192.168.122.30 port 51012 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:45:08 compute-1 systemd-logind[805]: New session 41 of user zuul.
Feb 02 09:45:08 compute-1 systemd[1]: Started Session 41 of User zuul.
Feb 02 09:45:08 compute-1 sshd-session[99092]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:45:08 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:45:08 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.348s CPU time.
Feb 02 09:45:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094508 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:45:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:09.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:09.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:09 compute-1 sshd-session[99094]: Connection closed by authenticating user root 123.58.212.100 port 58624 [preauth]
Feb 02 09:45:09 compute-1 python3.9[99277]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:45:10 compute-1 ceph-mon[80115]: pgmap v154: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 544 B/s rd, 90 B/s wr, 0 op/s
Feb 02 09:45:10 compute-1 python3.9[99433]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:45:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:10 compute-1 sshd-session[99282]: Connection closed by authenticating user root 123.58.212.100 port 58634 [preauth]
Feb 02 09:45:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:11.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:11 compute-1 sudo[99590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxgiszuypjihdniumabmnqrogndhnszz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025510.8275743-76-169102150979122/AnsiballZ_setup.py'
Feb 02 09:45:11 compute-1 sudo[99590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:11.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:11 compute-1 python3.9[99592]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:45:11 compute-1 sudo[99590]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:11 compute-1 sudo[99601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:45:11 compute-1 sudo[99601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:45:11 compute-1 sudo[99601]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:11 compute-1 sudo[99699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fngrwelwvrivdlariacptyguxxeikyoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025510.8275743-76-169102150979122/AnsiballZ_dnf.py'
Feb 02 09:45:12 compute-1 sudo[99699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:12 compute-1 sshd-session[99515]: Connection closed by authenticating user root 123.58.212.100 port 58638 [preauth]
Feb 02 09:45:12 compute-1 python3.9[99701]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:45:13 compute-1 ceph-mon[80115]: pgmap v155: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 363 B/s rd, 90 B/s wr, 0 op/s
Feb 02 09:45:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb 02 09:45:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:13.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb 02 09:45:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094513 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:45:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [NOTICE] 032/094513 (4) : haproxy version is 2.3.17-d1c9119
Feb 02 09:45:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [NOTICE] 032/094513 (4) : path to executable is /usr/local/sbin/haproxy
Feb 02 09:45:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [ALERT] 032/094513 (4) : backend 'backend' has no server available!
Feb 02 09:45:13 compute-1 sudo[99706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:45:13 compute-1 sudo[99706]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:45:13 compute-1 sudo[99706]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:13.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:13 compute-1 sshd-session[99703]: Connection closed by authenticating user root 123.58.212.100 port 43906 [preauth]
Feb 02 09:45:13 compute-1 sudo[99699]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:14 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:45:14 compute-1 ceph-mon[80115]: pgmap v156: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 363 B/s rd, 90 B/s wr, 0 op/s
Feb 02 09:45:14 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:45:14 compute-1 sudo[99880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deqikselhvhtlpzwahvjanwpcpivakyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025514.2604597-112-279354373719568/AnsiballZ_setup.py'
Feb 02 09:45:14 compute-1 sudo[99880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:14 compute-1 python3.9[99882]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:45:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:15.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:15 compute-1 sudo[99880]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:15.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:15 compute-1 sshd-session[71404]: Received disconnect from 38.102.83.241 port 53408:11: disconnected by user
Feb 02 09:45:15 compute-1 sshd-session[71404]: Disconnected from user zuul 38.102.83.241 port 53408
Feb 02 09:45:15 compute-1 sshd-session[71401]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:45:15 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Feb 02 09:45:15 compute-1 systemd[1]: session-19.scope: Consumed 7.909s CPU time.
Feb 02 09:45:15 compute-1 systemd-logind[805]: Session 19 logged out. Waiting for processes to exit.
Feb 02 09:45:15 compute-1 systemd-logind[805]: Removed session 19.
Feb 02 09:45:15 compute-1 sudo[100078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhqrlmmthldyiwfmzxjueknytplrjtrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025515.4766881-145-265648322676762/AnsiballZ_file.py'
Feb 02 09:45:15 compute-1 sudo[100078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:16 compute-1 sshd-session[99887]: Connection closed by authenticating user root 123.58.212.100 port 43916 [preauth]
Feb 02 09:45:16 compute-1 ceph-mon[80115]: pgmap v157: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 544 B/s rd, 272 B/s wr, 0 op/s
Feb 02 09:45:16 compute-1 python3.9[100080]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:45:16 compute-1 sudo[100078]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:16 compute-1 sudo[100233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyzyxgmbxlqpvnmvbbkvogogwlaxbhmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025516.4772046-169-253592717081720/AnsiballZ_command.py'
Feb 02 09:45:16 compute-1 sudo[100233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:17.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:17 compute-1 python3.9[100235]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:45:17 compute-1 sudo[100233]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:17.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:17 compute-1 sshd-session[100105]: Connection closed by authenticating user root 123.58.212.100 port 43930 [preauth]
Feb 02 09:45:17 compute-1 sudo[100398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uddmjwqhklfwzmpdjrksjvlhwufexbjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025517.4370332-193-21080524670205/AnsiballZ_stat.py'
Feb 02 09:45:17 compute-1 sudo[100398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:18 compute-1 ceph-mon[80115]: pgmap v158: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 544 B/s rd, 272 B/s wr, 0 op/s
Feb 02 09:45:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:45:18 compute-1 python3.9[100400]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:45:18 compute-1 sudo[100398]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:18 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 3.
Feb 02 09:45:18 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:45:18 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.348s CPU time.
Feb 02 09:45:18 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:45:18 compute-1 sudo[100492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esdvbhxboxycmkkdvlkjhxowdileyyng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025517.4370332-193-21080524670205/AnsiballZ_file.py'
Feb 02 09:45:18 compute-1 sudo[100492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:18 compute-1 podman[100531]: 2026-02-02 09:45:18.571943222 +0000 UTC m=+0.047279958 container create 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Feb 02 09:45:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:45:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:45:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:45:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:45:18 compute-1 podman[100531]: 2026-02-02 09:45:18.548095428 +0000 UTC m=+0.023432174 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:45:18 compute-1 podman[100531]: 2026-02-02 09:45:18.642048584 +0000 UTC m=+0.117385320 container init 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Feb 02 09:45:18 compute-1 podman[100531]: 2026-02-02 09:45:18.653230461 +0000 UTC m=+0.128567197 container start 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:45:18 compute-1 python3.9[100500]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:45:18 compute-1 bash[100531]: 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e
Feb 02 09:45:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:45:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:45:18 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:45:18 compute-1 sudo[100492]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:45:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:45:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:45:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:45:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:45:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:18 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:45:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb 02 09:45:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:19.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb 02 09:45:19 compute-1 sshd-session[100401]: Connection closed by authenticating user root 123.58.212.100 port 43936 [preauth]
Feb 02 09:45:19 compute-1 sudo[100738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmdoiqeopxusgymdbviupmtmomopamrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025518.8633578-229-209951719172216/AnsiballZ_stat.py'
Feb 02 09:45:19 compute-1 sudo[100738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:19.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:19 compute-1 python3.9[100740]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:45:19 compute-1 sudo[100738]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:19 compute-1 sudo[100818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyareecpmoadgutwjlcvvnqehaaqadtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025518.8633578-229-209951719172216/AnsiballZ_file.py'
Feb 02 09:45:19 compute-1 sudo[100818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:19 compute-1 python3.9[100820]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:45:19 compute-1 sudo[100818]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:20 compute-1 ceph-mon[80115]: pgmap v159: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Feb 02 09:45:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:20 compute-1 sudo[100970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcirqkidkthqdhkvzuylaanzyjvnrovh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025520.1969175-268-53398158937742/AnsiballZ_ini_file.py'
Feb 02 09:45:20 compute-1 sudo[100970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:20 compute-1 sshd-session[100741]: Connection closed by authenticating user root 123.58.212.100 port 43940 [preauth]
Feb 02 09:45:20 compute-1 python3.9[100972]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:45:20 compute-1 sudo[100970]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:21.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:21 compute-1 sudo[101125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztpxtgjycgulfjrzflgzqfnceimakvsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025521.012085-268-230871107861993/AnsiballZ_ini_file.py'
Feb 02 09:45:21 compute-1 sudo[101125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:21.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:21 compute-1 python3.9[101127]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:45:21 compute-1 sudo[101125]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:21 compute-1 sudo[101277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezwbldoqlvxyffrhawjbagnduxeyouik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025521.6610851-268-146853455839444/AnsiballZ_ini_file.py'
Feb 02 09:45:21 compute-1 sudo[101277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:22 compute-1 ceph-mon[80115]: pgmap v160: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 341 B/s wr, 1 op/s
Feb 02 09:45:22 compute-1 sshd-session[101021]: Connection closed by authenticating user root 123.58.212.100 port 43948 [preauth]
Feb 02 09:45:22 compute-1 python3.9[101279]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:45:22 compute-1 sudo[101277]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:22 compute-1 sudo[101431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfaenbhoizixvbsgsheoaynppzttdzdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025522.3078966-268-51937732126492/AnsiballZ_ini_file.py'
Feb 02 09:45:22 compute-1 sudo[101431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:22 compute-1 python3.9[101433]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:45:22 compute-1 sudo[101431]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:23.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:23.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:23 compute-1 sudo[101584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfrdwpdkrnslarzhiwbuqdcmprthlmqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025523.3279526-361-36699291965809/AnsiballZ_dnf.py'
Feb 02 09:45:23 compute-1 sudo[101584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:23 compute-1 python3.9[101586]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:45:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094523 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:45:24 compute-1 sshd-session[101356]: Connection closed by authenticating user root 123.58.212.100 port 56308 [preauth]
Feb 02 09:45:24 compute-1 ceph-mon[80115]: pgmap v161: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 341 B/s wr, 1 op/s
Feb 02 09:45:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Feb 02 09:45:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Feb 02 09:45:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:45:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:45:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:24 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Feb 02 09:45:24 compute-1 sudo[101584]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:25.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:25.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:25 compute-1 sshd-session[101588]: Connection closed by authenticating user root 123.58.212.100 port 56324 [preauth]
Feb 02 09:45:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:25 compute-1 sudo[101742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gehoyowgivaaehuwwqxefhbsfxtqjnrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025525.6324146-394-712648278793/AnsiballZ_setup.py'
Feb 02 09:45:25 compute-1 sudo[101742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:26 compute-1 ceph-mon[80115]: pgmap v162: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 852 B/s wr, 3 op/s
Feb 02 09:45:26 compute-1 python3.9[101744]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:45:26 compute-1 sudo[101742]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:26 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:45:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:26 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:45:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:26 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:45:26 compute-1 sudo[101897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzwompxusrwgqoixudioobqcjuecbkhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025526.4742177-418-109972847027397/AnsiballZ_stat.py'
Feb 02 09:45:26 compute-1 sudo[101897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:26 compute-1 python3.9[101899]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:45:26 compute-1 sudo[101897]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:27 compute-1 sshd-session[101668]: Connection closed by authenticating user root 123.58.212.100 port 56336 [preauth]
Feb 02 09:45:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:27.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:27.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:27 compute-1 sudo[102051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odnfnoazqbyffamzrtfpwwajskvhkjvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025527.2955854-445-21915466044049/AnsiballZ_stat.py'
Feb 02 09:45:27 compute-1 sudo[102051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:27 compute-1 python3.9[102053]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:45:27 compute-1 sudo[102051]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:28 compute-1 ceph-mon[80115]: pgmap v163: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 682 B/s wr, 3 op/s
Feb 02 09:45:28 compute-1 sshd-session[101932]: Connection closed by authenticating user root 123.58.212.100 port 56340 [preauth]
Feb 02 09:45:28 compute-1 sudo[102203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpwahsqzaeyjdxohqlmygmrcyqeqfdrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025528.2304747-475-78055691218778/AnsiballZ_command.py'
Feb 02 09:45:28 compute-1 sudo[102203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094528 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:45:28 compute-1 python3.9[102205]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:45:28 compute-1 sudo[102203]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:29.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:29.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:29 compute-1 sudo[102359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abjgtxmqlhmwcbqugbvdzjrkgtnblxqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025529.1334713-505-229162477751212/AnsiballZ_service_facts.py'
Feb 02 09:45:29 compute-1 sudo[102359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:29 compute-1 python3.9[102361]: ansible-service_facts Invoked
Feb 02 09:45:29 compute-1 network[102378]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 02 09:45:29 compute-1 network[102379]: 'network-scripts' will be removed from distribution in near future.
Feb 02 09:45:29 compute-1 network[102380]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 02 09:45:30 compute-1 ceph-mon[80115]: pgmap v164: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.8 KiB/s rd, 1.2 KiB/s wr, 5 op/s
Feb 02 09:45:30 compute-1 sshd-session[102206]: Connection closed by authenticating user root 123.58.212.100 port 56346 [preauth]
Feb 02 09:45:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:31.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:31.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:31 compute-1 sshd-session[102405]: Connection closed by authenticating user root 123.58.212.100 port 56356 [preauth]
Feb 02 09:45:32 compute-1 ceph-mon[80115]: pgmap v165: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:45:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000000a:nfs.cephfs.0: -2
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:45:32 compute-1 sudo[102359]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1f0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:32 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:33.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:33 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:33 compute-1 sshd-session[102475]: Connection closed by authenticating user root 123.58.212.100 port 60396 [preauth]
Feb 02 09:45:33 compute-1 sudo[102537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:45:33 compute-1 sudo[102537]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:45:33 compute-1 sudo[102537]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:33.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:34 compute-1 ceph-mon[80115]: pgmap v166: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:45:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:34 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:34 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:35.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094535 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:45:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:35 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:35.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:35 compute-1 sshd-session[102562]: Connection closed by authenticating user root 123.58.212.100 port 60402 [preauth]
Feb 02 09:45:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:35 compute-1 sudo[102715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtswdowslanivnwxyxgljzbvfbccfucq ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1770025535.4041154-550-221928548032105/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1770025535.4041154-550-221928548032105/args'
Feb 02 09:45:35 compute-1 sudo[102715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:35 compute-1 sudo[102715]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:36 compute-1 ceph-mon[80115]: pgmap v167: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Feb 02 09:45:36 compute-1 sudo[102882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lopjyzsrydzczeiddwhpkowfxfutgihy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025536.1559105-583-201819179204784/AnsiballZ_dnf.py'
Feb 02 09:45:36 compute-1 sudo[102882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:36 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:36 compute-1 python3.9[102884]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:45:36 compute-1 sshd-session[102665]: Connection closed by authenticating user root 123.58.212.100 port 60406 [preauth]
Feb 02 09:45:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:36 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:37.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:37 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:37.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:37 compute-1 sudo[102882]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:38 compute-1 ceph-mon[80115]: pgmap v168: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 682 B/s wr, 2 op/s
Feb 02 09:45:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:38 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:38 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:39 compute-1 sudo[103039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oargpbzwydayhadmdhrrrtecbzuzarbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025538.4383936-622-16869116248349/AnsiballZ_package_facts.py'
Feb 02 09:45:39 compute-1 sudo[103039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:39.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:39 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:39 compute-1 sshd-session[102887]: Connection closed by authenticating user root 123.58.212.100 port 60410 [preauth]
Feb 02 09:45:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:39.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:39 compute-1 python3.9[103041]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 02 09:45:39 compute-1 sudo[103039]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:40 compute-1 ceph-mon[80115]: pgmap v169: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 682 B/s wr, 2 op/s
Feb 02 09:45:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:40 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:40 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:40 compute-1 sudo[103194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvwcodepivpjsznoxqjpfjtauqrgrnws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025540.6252036-652-249520972199612/AnsiballZ_stat.py'
Feb 02 09:45:40 compute-1 sudo[103194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:41 compute-1 sshd-session[103042]: Connection closed by authenticating user root 123.58.212.100 port 60416 [preauth]
Feb 02 09:45:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:41.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:41 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:41 compute-1 python3.9[103196]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:45:41 compute-1 sudo[103194]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:41.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:41 compute-1 sudo[103274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dizgjludlttzslpwksfebumpmysrpgux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025540.6252036-652-249520972199612/AnsiballZ_file.py'
Feb 02 09:45:41 compute-1 sudo[103274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:41 compute-1 python3.9[103276]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:45:41 compute-1 sudo[103274]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:42 compute-1 sudo[103426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwofzksndgvdujrthtmkpexhkzxhrgop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025541.8616428-688-211838156934036/AnsiballZ_stat.py'
Feb 02 09:45:42 compute-1 sudo[103426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:42 compute-1 ceph-mon[80115]: pgmap v170: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 170 B/s wr, 0 op/s
Feb 02 09:45:42 compute-1 python3.9[103428]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:45:42 compute-1 sudo[103426]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:42 compute-1 sshd-session[103200]: Connection closed by authenticating user root 123.58.212.100 port 49228 [preauth]
Feb 02 09:45:42 compute-1 sudo[103504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuacjhvcedaqjiczuhgnzbdyuhqflgne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025541.8616428-688-211838156934036/AnsiballZ_file.py'
Feb 02 09:45:42 compute-1 sudo[103504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:42 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:42 compute-1 python3.9[103506]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:45:42 compute-1 sudo[103504]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:42 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:43.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:43 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:43.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:43 compute-1 sshd-session[103507]: Connection closed by authenticating user root 123.58.212.100 port 49230 [preauth]
Feb 02 09:45:44 compute-1 sudo[103661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzbqxhcnmfixbtvovrivhqgmrmhtqyog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025543.795991-743-203160638196080/AnsiballZ_lineinfile.py'
Feb 02 09:45:44 compute-1 sudo[103661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:44 compute-1 ceph-mon[80115]: pgmap v171: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 170 B/s wr, 0 op/s
Feb 02 09:45:44 compute-1 python3.9[103663]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:45:44 compute-1 sudo[103661]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:44 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:44 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0002df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb 02 09:45:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:45.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb 02 09:45:45 compute-1 sshd-session[103586]: Connection closed by authenticating user root 123.58.212.100 port 49238 [preauth]
Feb 02 09:45:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:45 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:45.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:46 compute-1 sudo[103816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itcrkvosuupalisarrhyhyqtbfxgifui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025545.7277558-787-191806652320508/AnsiballZ_setup.py'
Feb 02 09:45:46 compute-1 sudo[103816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:46 compute-1 python3.9[103818]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:45:46 compute-1 ceph-mon[80115]: pgmap v172: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 170 B/s wr, 1 op/s
Feb 02 09:45:46 compute-1 sshd-session[103689]: Connection closed by authenticating user root 123.58.212.100 port 49240 [preauth]
Feb 02 09:45:46 compute-1 sudo[103816]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:46 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:46 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:47.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:47 compute-1 sudo[103903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyvghtxvvqfdysfxhopjhvxdvcaavumr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025545.7277558-787-191806652320508/AnsiballZ_systemd.py'
Feb 02 09:45:47 compute-1 sudo[103903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:47 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0002df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:47.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:47 compute-1 ceph-mon[80115]: pgmap v173: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:45:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:45:47 compute-1 python3.9[103905]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:45:47 compute-1 sudo[103903]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:48 compute-1 sshd-session[103828]: Connection closed by authenticating user root 123.58.212.100 port 49246 [preauth]
Feb 02 09:45:48 compute-1 sshd-session[99126]: Connection closed by 192.168.122.30 port 51012
Feb 02 09:45:48 compute-1 sshd-session[99092]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:45:48 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Feb 02 09:45:48 compute-1 systemd[1]: session-41.scope: Consumed 22.798s CPU time.
Feb 02 09:45:48 compute-1 systemd-logind[805]: Session 41 logged out. Waiting for processes to exit.
Feb 02 09:45:48 compute-1 systemd-logind[805]: Removed session 41.
Feb 02 09:45:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:48 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d00032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:48 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:49.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:49 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:49.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:49 compute-1 sshd-session[103932]: Connection closed by authenticating user root 123.58.212.100 port 49260 [preauth]
Feb 02 09:45:50 compute-1 ceph-mon[80115]: pgmap v174: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:45:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:50 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0002df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:50 compute-1 sshd-session[103935]: Connection closed by authenticating user root 123.58.212.100 port 49266 [preauth]
Feb 02 09:45:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:50 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:51.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:51 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000028s ======
Feb 02 09:45:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:51.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Feb 02 09:45:52 compute-1 ceph-mon[80115]: pgmap v175: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:45:52 compute-1 sshd-session[103938]: Connection closed by authenticating user root 123.58.212.100 port 49270 [preauth]
Feb 02 09:45:52 compute-1 systemd[83549]: Created slice User Background Tasks Slice.
Feb 02 09:45:52 compute-1 systemd[83549]: Starting Cleanup of User's Temporary Files and Directories...
Feb 02 09:45:52 compute-1 systemd[83549]: Finished Cleanup of User's Temporary Files and Directories.
Feb 02 09:45:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:52 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:52 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:53.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:53 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:45:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:53.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:45:53 compute-1 sudo[103944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:45:53 compute-1 sudo[103944]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:45:53 compute-1 sudo[103944]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:53 compute-1 sshd-session[103969]: Accepted publickey for zuul from 192.168.122.30 port 50160 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:45:53 compute-1 systemd-logind[805]: New session 42 of user zuul.
Feb 02 09:45:53 compute-1 systemd[1]: Started Session 42 of User zuul.
Feb 02 09:45:53 compute-1 sshd-session[103969]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:45:53 compute-1 sshd-session[103941]: Connection closed by authenticating user root 123.58.212.100 port 51394 [preauth]
Feb 02 09:45:54 compute-1 ceph-mon[80115]: pgmap v176: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:45:54 compute-1 sudo[104124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unfnnqqrcoyyuckprjlutsirlrbsetna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025553.7145567-22-108855902790677/AnsiballZ_file.py'
Feb 02 09:45:54 compute-1 sudo[104124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:54 compute-1 python3.9[104126]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:45:54 compute-1 sudo[104124]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:54 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:54 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:55.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:55 compute-1 sudo[104277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srhljpvhhddffpqtngdddvvjvkanpxnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025554.6846757-58-59909503620041/AnsiballZ_stat.py'
Feb 02 09:45:55 compute-1 sudo[104277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:55 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:55.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:55 compute-1 python3.9[104279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:45:55 compute-1 sshd-session[104032]: Connection closed by authenticating user root 123.58.212.100 port 51404 [preauth]
Feb 02 09:45:55 compute-1 sudo[104277]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:45:55 compute-1 sudo[104357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvwmpwogrxrknegokocygaidgvxrutcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025554.6846757-58-59909503620041/AnsiballZ_file.py'
Feb 02 09:45:55 compute-1 sudo[104357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:45:55 compute-1 python3.9[104359]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:45:55 compute-1 sudo[104357]: pam_unix(sudo:session): session closed for user root
Feb 02 09:45:56 compute-1 ceph-mon[80115]: pgmap v177: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:45:56 compute-1 sshd-session[103972]: Connection closed by 192.168.122.30 port 50160
Feb 02 09:45:56 compute-1 sshd-session[103969]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:45:56 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Feb 02 09:45:56 compute-1 systemd[1]: session-42.scope: Consumed 1.420s CPU time.
Feb 02 09:45:56 compute-1 systemd-logind[805]: Session 42 logged out. Waiting for processes to exit.
Feb 02 09:45:56 compute-1 systemd-logind[805]: Removed session 42.
Feb 02 09:45:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:56 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:56 compute-1 sshd-session[104335]: Connection closed by authenticating user root 123.58.212.100 port 51418 [preauth]
Feb 02 09:45:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:56 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:57.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:57 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:57.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:58 compute-1 sshd-session[104386]: Connection closed by authenticating user root 123.58.212.100 port 51422 [preauth]
Feb 02 09:45:58 compute-1 ceph-mon[80115]: pgmap v178: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:45:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:58 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:58 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:45:59.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:45:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:45:59 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:45:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:45:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:45:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:45:59.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:00 compute-1 ceph-mon[80115]: pgmap v179: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:46:00 compute-1 sshd-session[104388]: Connection closed by authenticating user root 123.58.212.100 port 51438 [preauth]
Feb 02 09:46:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:00 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:00 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e0003ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:01.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:01 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:01.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:01 compute-1 sshd-session[104394]: Accepted publickey for zuul from 192.168.122.30 port 47826 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:46:01 compute-1 systemd-logind[805]: New session 43 of user zuul.
Feb 02 09:46:01 compute-1 systemd[1]: Started Session 43 of User zuul.
Feb 02 09:46:01 compute-1 sshd-session[104394]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:46:01 compute-1 sshd-session[104391]: Connection closed by authenticating user root 123.58.212.100 port 51444 [preauth]
Feb 02 09:46:02 compute-1 ceph-mon[80115]: pgmap v180: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:46:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:02 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:02 compute-1 python3.9[104549]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:46:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:02 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:03.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:46:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:03 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:03 compute-1 sshd-session[104450]: Connection closed by authenticating user root 123.58.212.100 port 42998 [preauth]
Feb 02 09:46:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:03.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:03 compute-1 sudo[104704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbcvmriocwronehdiyozjmpjdtknfdmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025563.3976903-55-268540564219644/AnsiballZ_file.py'
Feb 02 09:46:03 compute-1 sudo[104704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:03 compute-1 python3.9[104706]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:04 compute-1 sudo[104704]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:04 compute-1 ceph-mon[80115]: pgmap v181: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:46:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:04 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1d0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:04 compute-1 sudo[104882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnneaugbeyktxnwiwrnwzpkojawhhkcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025564.2868757-79-50884814647727/AnsiballZ_stat.py'
Feb 02 09:46:04 compute-1 sudo[104882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:04 compute-1 python3.9[104886]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:04 compute-1 sudo[104882]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:04 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:05.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:05 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1cc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:05 compute-1 sudo[104962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwvbinjqygttjjotiqdzgjwknuktdkbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025564.2868757-79-50884814647727/AnsiballZ_file.py'
Feb 02 09:46:05 compute-1 sudo[104962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:05 compute-1 python3.9[104964]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.0juwud0s recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:05 compute-1 sudo[104962]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:05.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:05 compute-1 sshd-session[104806]: Connection closed by authenticating user root 123.58.212.100 port 43002 [preauth]
Feb 02 09:46:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094605 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:46:06 compute-1 ceph-mon[80115]: pgmap v182: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:46:06 compute-1 sudo[105116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtgsbbecikhkqezucsyiumtnwgyenkuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025566.0530531-139-229991582817854/AnsiballZ_stat.py'
Feb 02 09:46:06 compute-1 sudo[105116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:06 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:06 compute-1 python3.9[105118]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:06 compute-1 sudo[105116]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:06 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1e8002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:07 compute-1 sudo[105195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shqftzwgqvbihrkcqlrpvkhsyvwgbxbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025566.0530531-139-229991582817854/AnsiballZ_file.py'
Feb 02 09:46:07 compute-1 sudo[105195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:07.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[100546]: 02/02/2026 09:46:07 : epoch 6980722e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe1c8003820 fd 38 proxy ignored for local
Feb 02 09:46:07 compute-1 kernel: ganesha.nfsd[102536]: segfault at 50 ip 00007fe2786d032e sp 00007fe1e4ff8210 error 4 in libntirpc.so.5.8[7fe2786b5000+2c000] likely on CPU 2 (core 0, socket 2)
Feb 02 09:46:07 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:46:07 compute-1 systemd[1]: Started Process Core Dump (PID 105198/UID 0).
Feb 02 09:46:07 compute-1 sshd-session[104989]: Connection closed by authenticating user root 123.58.212.100 port 43006 [preauth]
Feb 02 09:46:07 compute-1 python3.9[105197]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.bx0j27wq recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:07.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:07 compute-1 sudo[105195]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:07 compute-1 sudo[105351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkjxbibfdsopzxbajiqwuyhesxiqiraw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025567.6578069-178-228947051872825/AnsiballZ_file.py'
Feb 02 09:46:07 compute-1 sudo[105351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:08 compute-1 systemd-coredump[105199]: Process 100550 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 55:
                                                    #0  0x00007fe2786d032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007fe2786da900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 09:46:08 compute-1 systemd[1]: systemd-coredump@3-105198-0.service: Deactivated successfully.
Feb 02 09:46:08 compute-1 python3.9[105353]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:46:08 compute-1 sudo[105351]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:08 compute-1 podman[105358]: 2026-02-02 09:46:08.140773729 +0000 UTC m=+0.029216273 container died 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:46:08 compute-1 systemd[1]: var-lib-containers-storage-overlay-7c29b5401ef58f083f85a8e200edfe4e6670b4a6d9a10c29e8c2df49cb38fe6d-merged.mount: Deactivated successfully.
Feb 02 09:46:08 compute-1 podman[105358]: 2026-02-02 09:46:08.194327833 +0000 UTC m=+0.082770367 container remove 0a031cdcde842b9c7e1b7bd84624e8f88ad9ced371b6ec53ec2ad93b8ba8a11e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Feb 02 09:46:08 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:46:08 compute-1 ceph-mon[80115]: pgmap v183: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:46:08 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:46:08 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.070s CPU time.
Feb 02 09:46:08 compute-1 sudo[105551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxpwgvvxmvlxrplmadthmnkztlwdiuqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025568.3915343-202-71151665242878/AnsiballZ_stat.py'
Feb 02 09:46:08 compute-1 sudo[105551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:08 compute-1 sshd-session[105224]: Connection closed by authenticating user root 123.58.212.100 port 43014 [preauth]
Feb 02 09:46:08 compute-1 python3.9[105554]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:08 compute-1 sudo[105551]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:09 compute-1 sudo[105632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixspjpslveomoygsvlmjowhekgazqewb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025568.3915343-202-71151665242878/AnsiballZ_file.py'
Feb 02 09:46:09 compute-1 sudo[105632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:09.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:09 compute-1 python3.9[105634]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:46:09 compute-1 sudo[105632]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:09.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:09 compute-1 sudo[105784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylqhmtfupuqiznvldslimagenpjxiwbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025569.466717-202-261505710220534/AnsiballZ_stat.py'
Feb 02 09:46:09 compute-1 sudo[105784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:09 compute-1 python3.9[105786]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:10 compute-1 sudo[105784]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:10 compute-1 sudo[105862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yztoctvioqijlqzmeltnniwzgjrcslyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025569.466717-202-261505710220534/AnsiballZ_file.py'
Feb 02 09:46:10 compute-1 sudo[105862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:10 compute-1 ceph-mon[80115]: pgmap v184: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:46:10 compute-1 python3.9[105864]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:46:10 compute-1 sudo[105862]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:10 compute-1 sshd-session[105607]: Connection closed by authenticating user root 123.58.212.100 port 43028 [preauth]
Feb 02 09:46:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:11 compute-1 sudo[106015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eovinfnujeiknsdvkxgjwfiqcnwuotbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025570.5882852-271-273582708805954/AnsiballZ_file.py'
Feb 02 09:46:11 compute-1 sudo[106015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:11.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:11 compute-1 python3.9[106017]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:11 compute-1 sudo[106015]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:11.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:12 compute-1 ceph-mon[80115]: pgmap v185: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:46:12 compute-1 sshd-session[106042]: Connection closed by authenticating user root 123.58.212.100 port 43030 [preauth]
Feb 02 09:46:12 compute-1 sudo[106169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-putemhnggbckckmldvnmmwdyignjkjik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025572.4465888-295-76815533404088/AnsiballZ_stat.py'
Feb 02 09:46:12 compute-1 sudo[106169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:12 compute-1 python3.9[106171]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:12 compute-1 sudo[106169]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:13 compute-1 sudo[106250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfpwiaixdasfnlwbpdylviaqbxdvheke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025572.4465888-295-76815533404088/AnsiballZ_file.py'
Feb 02 09:46:13 compute-1 sudo[106250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:13.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094613 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:46:13 compute-1 python3.9[106252]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:13 compute-1 sudo[106250]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:13.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:13 compute-1 sudo[106253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:46:13 compute-1 sudo[106253]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:46:13 compute-1 sudo[106253]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:13 compute-1 sudo[106273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:46:13 compute-1 sudo[106273]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:46:13 compute-1 sudo[106273]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:13 compute-1 sudo[106303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:46:13 compute-1 sudo[106303]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:46:13 compute-1 sudo[106303]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:14 compute-1 ceph-mon[80115]: pgmap v186: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:46:14 compute-1 sshd-session[106173]: Connection closed by authenticating user root 123.58.212.100 port 57322 [preauth]
Feb 02 09:46:14 compute-1 sudo[106510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhrwqkpwdekjlhfcxcozlzogxdytdvlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025574.3414533-331-258198070278552/AnsiballZ_stat.py'
Feb 02 09:46:14 compute-1 sudo[106510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:14 compute-1 python3.9[106512]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:14 compute-1 sudo[106510]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:15 compute-1 sudo[106589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvsjjhlskjtupmdykzjbwbgyxtnztoul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025574.3414533-331-258198070278552/AnsiballZ_file.py'
Feb 02 09:46:15 compute-1 sudo[106589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:46:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:46:15 compute-1 ceph-mon[80115]: pgmap v187: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 547 B/s rd, 91 B/s wr, 0 op/s
Feb 02 09:46:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:46:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:46:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:46:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:46:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:46:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:15.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:15 compute-1 python3.9[106591]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:15 compute-1 sudo[106589]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:46:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:15.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:46:15 compute-1 sshd-session[106406]: Connection closed by authenticating user root 123.58.212.100 port 57334 [preauth]
Feb 02 09:46:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:16 compute-1 sudo[106743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuofheeijfeuzvzqpwkfwhrbbqssortp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025575.6137989-367-43005292246828/AnsiballZ_systemd.py'
Feb 02 09:46:16 compute-1 sudo[106743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:16 compute-1 python3.9[106745]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:46:16 compute-1 systemd[1]: Reloading.
Feb 02 09:46:16 compute-1 systemd-rc-local-generator[106771]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:46:16 compute-1 systemd-sysv-generator[106777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:46:16 compute-1 sudo[106743]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:16 compute-1 sshd-session[106668]: Connection closed by authenticating user root 123.58.212.100 port 57338 [preauth]
Feb 02 09:46:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:17.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:17 compute-1 ceph-mon[80115]: pgmap v188: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 364 B/s rd, 91 B/s wr, 0 op/s
Feb 02 09:46:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:46:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:17.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:17 compute-1 sudo[106936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kemdnyfflbaevyvuayvyvexbyorlyvpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025577.153271-391-82630418579251/AnsiballZ_stat.py'
Feb 02 09:46:17 compute-1 sudo[106936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:17 compute-1 python3.9[106938]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:17 compute-1 sudo[106936]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:17 compute-1 sudo[107014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbnvdmiylglymffmliikwzzjepdydwwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025577.153271-391-82630418579251/AnsiballZ_file.py'
Feb 02 09:46:17 compute-1 sudo[107014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:18 compute-1 python3.9[107016]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:18 compute-1 sudo[107014]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:18 compute-1 sshd-session[106809]: Connection closed by authenticating user root 123.58.212.100 port 57344 [preauth]
Feb 02 09:46:18 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 4.
Feb 02 09:46:18 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:46:18 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.070s CPU time.
Feb 02 09:46:18 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:46:18 compute-1 sudo[107190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olmovyxvmwtlfxopyzgvvddoytqekgld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025578.3335078-427-123353966542429/AnsiballZ_stat.py'
Feb 02 09:46:18 compute-1 sudo[107190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:18 compute-1 podman[107220]: 2026-02-02 09:46:18.790941664 +0000 UTC m=+0.052598199 container create 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:46:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:46:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:46:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:46:18 compute-1 python3.9[107201]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:18 compute-1 podman[107220]: 2026-02-02 09:46:18.768580936 +0000 UTC m=+0.030237451 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:46:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:46:18 compute-1 podman[107220]: 2026-02-02 09:46:18.877639915 +0000 UTC m=+0.139296440 container init 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Feb 02 09:46:18 compute-1 podman[107220]: 2026-02-02 09:46:18.881632252 +0000 UTC m=+0.143288777 container start 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:46:18 compute-1 bash[107220]: 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1
Feb 02 09:46:18 compute-1 sudo[107190]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:46:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:46:18 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:46:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:46:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:46:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:46:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:46:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:46:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:46:19 compute-1 sudo[107353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jadfsnmpypbbacbjuxutekybqecmpzkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025578.3335078-427-123353966542429/AnsiballZ_file.py'
Feb 02 09:46:19 compute-1 sudo[107353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:19.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:19 compute-1 ceph-mon[80115]: pgmap v189: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 456 B/s wr, 1 op/s
Feb 02 09:46:19 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:46:19 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:46:19 compute-1 sudo[107356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:46:19 compute-1 sudo[107356]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:46:19 compute-1 sudo[107356]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:19 compute-1 python3.9[107355]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:19 compute-1 sudo[107353]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:19.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:19 compute-1 sudo[107530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shminvnzqqnbyixamrzdeixcbifkbshk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025579.5081675-463-143224619358526/AnsiballZ_systemd.py'
Feb 02 09:46:19 compute-1 sudo[107530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:20 compute-1 sshd-session[107094]: Connection closed by authenticating user root 123.58.212.100 port 57358 [preauth]
Feb 02 09:46:20 compute-1 python3.9[107532]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:46:20 compute-1 systemd[1]: Reloading.
Feb 02 09:46:20 compute-1 systemd-rc-local-generator[107557]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:46:20 compute-1 systemd-sysv-generator[107563]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:46:20 compute-1 systemd[1]: Starting Create netns directory...
Feb 02 09:46:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:20 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 02 09:46:20 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 02 09:46:20 compute-1 systemd[1]: Finished Create netns directory.
Feb 02 09:46:20 compute-1 sudo[107530]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:21.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:21 compute-1 ceph-mon[80115]: pgmap v190: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 456 B/s wr, 1 op/s
Feb 02 09:46:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:21.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:21 compute-1 python3.9[107726]: ansible-ansible.builtin.service_facts Invoked
Feb 02 09:46:21 compute-1 sshd-session[107534]: Connection closed by authenticating user root 123.58.212.100 port 57364 [preauth]
Feb 02 09:46:21 compute-1 network[107743]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 02 09:46:21 compute-1 network[107744]: 'network-scripts' will be removed from distribution in near future.
Feb 02 09:46:21 compute-1 network[107745]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 02 09:46:22 compute-1 sshd-session[107751]: Connection closed by authenticating user root 123.58.212.100 port 39146 [preauth]
Feb 02 09:46:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:23.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:23 compute-1 ceph-mon[80115]: pgmap v191: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 456 B/s wr, 1 op/s
Feb 02 09:46:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:23.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:24 compute-1 sshd-session[107776]: Connection closed by authenticating user root 123.58.212.100 port 39154 [preauth]
Feb 02 09:46:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:46:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:46:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:25.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:25 compute-1 ceph-mon[80115]: pgmap v192: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Feb 02 09:46:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:25.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:25 compute-1 sshd-session[107855]: Connection closed by authenticating user root 123.58.212.100 port 39158 [preauth]
Feb 02 09:46:25 compute-1 sudo[108013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujhswjiwffwamwvqbycbsrquqvfzdayk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025585.609709-541-103033828413325/AnsiballZ_stat.py'
Feb 02 09:46:25 compute-1 sudo[108013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:26 compute-1 python3.9[108015]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:26 compute-1 sudo[108013]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:26 compute-1 sudo[108093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aogsuyhqhzshemojefrfrmpgpaqpoumr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025585.609709-541-103033828413325/AnsiballZ_file.py'
Feb 02 09:46:26 compute-1 sudo[108093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:26 compute-1 python3.9[108095]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:26 compute-1 sudo[108093]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:26 compute-1 sudo[108246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axtvbhbxawgqqbyigrftopsknjlatvtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025586.745429-580-210896452910448/AnsiballZ_file.py'
Feb 02 09:46:26 compute-1 sudo[108246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:27 compute-1 sshd-session[108016]: Connection closed by authenticating user root 123.58.212.100 port 39166 [preauth]
Feb 02 09:46:27 compute-1 python3.9[108248]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:27.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:27 compute-1 sudo[108246]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:27 compute-1 ceph-mon[80115]: pgmap v193: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:46:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:27.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:27 compute-1 sudo[108400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eslzyghipuzmbdiwdverdudilirgqydq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025587.4500334-604-263473830279837/AnsiballZ_stat.py'
Feb 02 09:46:27 compute-1 sudo[108400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:27 compute-1 python3.9[108402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094627 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:46:27 compute-1 sudo[108400]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:28 compute-1 sudo[108478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rghutzscocndzofyaknnwgfkuemlsuer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025587.4500334-604-263473830279837/AnsiballZ_file.py'
Feb 02 09:46:28 compute-1 sudo[108478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:28 compute-1 python3.9[108480]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:28 compute-1 sudo[108478]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:28 compute-1 sshd-session[108273]: Connection closed by authenticating user root 123.58.212.100 port 39170 [preauth]
Feb 02 09:46:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:29.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:29 compute-1 ceph-mon[80115]: pgmap v194: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.9 KiB/s rd, 1.7 KiB/s wr, 5 op/s
Feb 02 09:46:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:29.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:29 compute-1 sudo[108633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcbrjfjyjuxjswwlfebfczqfdnkllfna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025589.0305676-649-117684390584000/AnsiballZ_timezone.py'
Feb 02 09:46:29 compute-1 sudo[108633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:29 compute-1 python3.9[108635]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 02 09:46:29 compute-1 systemd[1]: Starting Time & Date Service...
Feb 02 09:46:29 compute-1 systemd[1]: Started Time & Date Service.
Feb 02 09:46:29 compute-1 sudo[108633]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:29 compute-1 sshd-session[108506]: Connection closed by authenticating user root 123.58.212.100 port 39176 [preauth]
Feb 02 09:46:30 compute-1 sudo[108791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryudlcfbmoucowduenvscuxrnuuygtnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025590.1173878-676-81297485594821/AnsiballZ_file.py'
Feb 02 09:46:30 compute-1 sudo[108791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:30 compute-1 ceph-mon[80115]: pgmap v195: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:46:30 compute-1 python3.9[108793]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:30 compute-1 sudo[108791]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000000c:nfs.cephfs.0: -2
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:46:31 compute-1 sudo[108956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftotbzknfipiapgzqyanbgqtaatsywfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025590.8593693-700-10667560416092/AnsiballZ_stat.py'
Feb 02 09:46:31 compute-1 sudo[108956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:31.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:31 compute-1 sshd-session[108721]: Connection closed by authenticating user root 123.58.212.100 port 39178 [preauth]
Feb 02 09:46:31 compute-1 python3.9[108958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:31 compute-1 sudo[108956]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:31.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:31 compute-1 sudo[109040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gelofhrtwzfrjkkaroybjlnzospqecga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025590.8593693-700-10667560416092/AnsiballZ_file.py'
Feb 02 09:46:31 compute-1 sudo[109040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:31 compute-1 python3.9[109042]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:31 compute-1 sudo[109040]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:46:32 compute-1 sudo[109192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixgsdhptmszrxvlnxdzbpklnhwlexmtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025592.0054634-736-138817432108264/AnsiballZ_stat.py'
Feb 02 09:46:32 compute-1 sudo[109192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:32 compute-1 python3.9[109194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:32 compute-1 sudo[109192]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:32 compute-1 sshd-session[109012]: Connection closed by authenticating user root 123.58.212.100 port 37376 [preauth]
Feb 02 09:46:32 compute-1 sudo[109271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whdcvaqbqfufivykgizhnxajmdkyfauf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025592.0054634-736-138817432108264/AnsiballZ_file.py'
Feb 02 09:46:32 compute-1 sudo[109271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:32 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:46:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:32 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:32 compute-1 python3.9[109273]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.owyvyunj recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:32 compute-1 sudo[109271]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:32 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4000e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:33.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094633 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:46:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:33 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4000e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:33 compute-1 ceph-mon[80115]: pgmap v196: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:46:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:33.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:33 compute-1 sudo[109353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:46:33 compute-1 sudo[109353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:46:33 compute-1 sudo[109353]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:33 compute-1 sudo[109451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnxjmcrvjyxrkceeystmhhbnknfvcxgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025593.2078533-772-77385713163678/AnsiballZ_stat.py'
Feb 02 09:46:33 compute-1 sudo[109451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:33 compute-1 python3.9[109453]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:33 compute-1 sudo[109451]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:34 compute-1 sshd-session[109275]: Connection closed by authenticating user root 123.58.212.100 port 37384 [preauth]
Feb 02 09:46:34 compute-1 sudo[109529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwgvqegvjabktlburbrwjivbsxgldmsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025593.2078533-772-77385713163678/AnsiballZ_file.py'
Feb 02 09:46:34 compute-1 sudo[109529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:34 compute-1 python3.9[109531]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:34 compute-1 sudo[109529]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:34 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:34 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:35 compute-1 sudo[109682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzdmoomvysonljnjqwoyljoeiorgoirl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025594.7525578-811-96003524230489/AnsiballZ_command.py'
Feb 02 09:46:35 compute-1 sudo[109682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:35.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:35 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:35 compute-1 ceph-mon[80115]: pgmap v197: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:46:35 compute-1 python3.9[109684]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:46:35 compute-1 sudo[109682]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:35.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:36 compute-1 sudo[109837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbdczquairhhcsyvgqcmukeqssrqasja ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1770025595.6507463-835-69215002258628/AnsiballZ_edpm_nftables_from_files.py'
Feb 02 09:46:36 compute-1 sudo[109837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:36 compute-1 python3[109839]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 02 09:46:36 compute-1 sudo[109837]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:36 compute-1 sshd-session[109685]: Connection closed by authenticating user root 123.58.212.100 port 37392 [preauth]
Feb 02 09:46:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:36 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:37 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:37 compute-1 sudo[109992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kojcwkoirbxukzabmellsbixmvvryicq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025596.797612-859-172553466125518/AnsiballZ_stat.py'
Feb 02 09:46:37 compute-1 sudo[109992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:37.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:37 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:37 compute-1 ceph-mon[80115]: pgmap v198: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 2 op/s
Feb 02 09:46:37 compute-1 python3.9[109994]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:37 compute-1 sudo[109992]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:37.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:37 compute-1 sudo[110070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nritrteqbarnrlynnzptpwcxyzyofhir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025596.797612-859-172553466125518/AnsiballZ_file.py'
Feb 02 09:46:37 compute-1 sudo[110070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:37 compute-1 python3.9[110072]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:37 compute-1 sudo[110070]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:37 compute-1 sshd-session[109865]: Connection closed by authenticating user root 123.58.212.100 port 37396 [preauth]
Feb 02 09:46:38 compute-1 sudo[110224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plrrrzjlhkinwiwllvwvfepzlaeedblw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025598.0979018-895-242187411062551/AnsiballZ_stat.py'
Feb 02 09:46:38 compute-1 sudo[110224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:38 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:38 compute-1 python3.9[110227]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:38 compute-1 sudo[110224]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:39 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:39.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:39 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:39 compute-1 sshd-session[110137]: Connection closed by authenticating user root 123.58.212.100 port 37406 [preauth]
Feb 02 09:46:39 compute-1 ceph-mon[80115]: pgmap v199: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 2 op/s
Feb 02 09:46:39 compute-1 sudo[110350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hisggzhimbxvaboqjbyirqerwzlyovvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025598.0979018-895-242187411062551/AnsiballZ_copy.py'
Feb 02 09:46:39 compute-1 sudo[110350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:39.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:39 compute-1 python3.9[110352]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025598.0979018-895-242187411062551/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:39 compute-1 sudo[110350]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:40 compute-1 sudo[110504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfrdmkngewxhlpabcbcewvhyjxlbkahm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025599.914529-940-236886435449609/AnsiballZ_stat.py'
Feb 02 09:46:40 compute-1 sudo[110504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:40 compute-1 python3.9[110506]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:40 compute-1 sudo[110504]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:40 compute-1 sudo[110582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcgwcjluhutqebelyknneprjmoiimsyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025599.914529-940-236886435449609/AnsiballZ_file.py'
Feb 02 09:46:40 compute-1 sudo[110582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:40 compute-1 sshd-session[110353]: Connection closed by authenticating user root 123.58.212.100 port 37422 [preauth]
Feb 02 09:46:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:40 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:40 compute-1 python3.9[110584]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:40 compute-1 sudo[110582]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:41 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:41.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:41 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:41 compute-1 ceph-mon[80115]: pgmap v200: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:46:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:41.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:41 compute-1 sudo[110737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzadnmtzsbalsllmkaklcqicvpinytxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025601.1462142-976-190175024615891/AnsiballZ_stat.py'
Feb 02 09:46:41 compute-1 sudo[110737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:41 compute-1 python3.9[110739]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:41 compute-1 sudo[110737]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:41 compute-1 sudo[110815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgibkrmivvfmevgvimjhtaxocroracbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025601.1462142-976-190175024615891/AnsiballZ_file.py'
Feb 02 09:46:41 compute-1 sudo[110815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:42 compute-1 python3.9[110817]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:42 compute-1 sudo[110815]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:42 compute-1 sshd-session[110606]: Connection closed by authenticating user root 123.58.212.100 port 37426 [preauth]
Feb 02 09:46:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:42 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:42 compute-1 sudo[110970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmqppupuebvimjcbwvpqsqilbkfrexcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025602.580309-1012-55028135963785/AnsiballZ_stat.py'
Feb 02 09:46:42 compute-1 sudo[110970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:43 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:43 compute-1 python3.9[110972]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:43 compute-1 sudo[110970]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:43.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:43 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:43 compute-1 sudo[111048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnqnzlscywwmljyyeeirpimnbqjghvng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025602.580309-1012-55028135963785/AnsiballZ_file.py'
Feb 02 09:46:43 compute-1 sudo[111048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:43 compute-1 ceph-mon[80115]: pgmap v201: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:46:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:43.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:43 compute-1 python3.9[111050]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:43 compute-1 sudo[111048]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:43 compute-1 sshd-session[110842]: Connection closed by authenticating user root 123.58.212.100 port 59096 [preauth]
Feb 02 09:46:43 compute-1 sudo[111200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhrxrquqxzitrlabvlemrvfaucsfkval ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025603.7304974-1051-9282709073688/AnsiballZ_command.py'
Feb 02 09:46:43 compute-1 sudo[111200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:44 compute-1 python3.9[111202]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:46:44 compute-1 sudo[111200]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:44 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb80016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:44 compute-1 sudo[111358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxpuyfvpvfhompqifetxdfjqeegpcrsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025604.416724-1076-234929901966518/AnsiballZ_blockinfile.py'
Feb 02 09:46:44 compute-1 sudo[111358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:44 compute-1 sshd-session[111203]: Invalid user user from 123.58.212.100 port 59114
Feb 02 09:46:44 compute-1 python3.9[111360]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:45 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:45 compute-1 sudo[111358]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:45.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:45 compute-1 sshd-session[111203]: Connection closed by invalid user user 123.58.212.100 port 59114 [preauth]
Feb 02 09:46:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:45 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:45 compute-1 ceph-mon[80115]: pgmap v202: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:46:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:45.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:45 compute-1 sudo[111512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsorioxbafktltgcxgouzjmmcuqbdwqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025605.310008-1102-210085836536506/AnsiballZ_file.py'
Feb 02 09:46:45 compute-1 sudo[111512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:45 compute-1 python3.9[111514]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:45 compute-1 sudo[111512]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:46 compute-1 sudo[111664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdgnljudomslnwoupfkewlxbeezpigke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025605.8651805-1102-867320788027/AnsiballZ_file.py'
Feb 02 09:46:46 compute-1 sudo[111664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:46 compute-1 sshd-session[111460]: Invalid user user from 123.58.212.100 port 59126
Feb 02 09:46:46 compute-1 python3.9[111666]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:46 compute-1 sudo[111664]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:46 compute-1 sshd-session[111460]: Connection closed by invalid user user 123.58.212.100 port 59126 [preauth]
Feb 02 09:46:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:46 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:47 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4001dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:47.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:47 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:47 compute-1 sudo[111819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuspciclmtzriqyplxikseswmjcjzoon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025606.9504888-1147-34538245118775/AnsiballZ_mount.py'
Feb 02 09:46:47 compute-1 ceph-mon[80115]: pgmap v203: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:46:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:46:47 compute-1 sudo[111819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:46:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:47.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:46:47 compute-1 python3.9[111821]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 02 09:46:47 compute-1 sudo[111819]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:47 compute-1 sshd-session[111692]: Invalid user user from 123.58.212.100 port 59144
Feb 02 09:46:48 compute-1 sudo[111971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbhnycyecnpxadgzqqmgnmxzcgptkhek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025607.7653465-1147-65189742648650/AnsiballZ_mount.py'
Feb 02 09:46:48 compute-1 sudo[111971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:48 compute-1 python3.9[111973]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 02 09:46:48 compute-1 sshd-session[111692]: Connection closed by invalid user user 123.58.212.100 port 59144 [preauth]
Feb 02 09:46:48 compute-1 sudo[111971]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:48 compute-1 sshd-session[104397]: Connection closed by 192.168.122.30 port 47826
Feb 02 09:46:48 compute-1 sshd-session[104394]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:46:48 compute-1 systemd[1]: session-43.scope: Deactivated successfully.
Feb 02 09:46:48 compute-1 systemd[1]: session-43.scope: Consumed 26.026s CPU time.
Feb 02 09:46:48 compute-1 systemd-logind[805]: Session 43 logged out. Waiting for processes to exit.
Feb 02 09:46:48 compute-1 systemd-logind[805]: Removed session 43.
Feb 02 09:46:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:48 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb80021e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:49 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:49.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:49 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:49 compute-1 sshd-session[111998]: Invalid user user from 123.58.212.100 port 59146
Feb 02 09:46:49 compute-1 ceph-mon[80115]: pgmap v204: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:46:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:46:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:49.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:46:49 compute-1 sshd-session[111998]: Connection closed by invalid user user 123.58.212.100 port 59146 [preauth]
Feb 02 09:46:50 compute-1 ceph-mon[80115]: pgmap v205: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:46:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:50 compute-1 sshd-session[112001]: Invalid user user from 123.58.212.100 port 59162
Feb 02 09:46:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:50 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:50 compute-1 sshd-session[112001]: Connection closed by invalid user user 123.58.212.100 port 59162 [preauth]
Feb 02 09:46:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:51 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:51.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:51 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:51.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:52 compute-1 sshd-session[112005]: Invalid user user from 123.58.212.100 port 59174
Feb 02 09:46:52 compute-1 sshd-session[112005]: Connection closed by invalid user user 123.58.212.100 port 59174 [preauth]
Feb 02 09:46:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:52 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:53 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8002b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:53.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:53 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:53 compute-1 ceph-mon[80115]: pgmap v206: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:46:53 compute-1 sshd-session[112007]: Invalid user user from 123.58.212.100 port 39058
Feb 02 09:46:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:46:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:53.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:46:53 compute-1 sshd-session[112007]: Connection closed by invalid user user 123.58.212.100 port 39058 [preauth]
Feb 02 09:46:53 compute-1 sudo[112010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:46:53 compute-1 sudo[112010]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:46:53 compute-1 sudo[112010]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:54 compute-1 sshd-session[112037]: Accepted publickey for zuul from 192.168.122.30 port 33894 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:46:54 compute-1 systemd-logind[805]: New session 44 of user zuul.
Feb 02 09:46:54 compute-1 systemd[1]: Started Session 44 of User zuul.
Feb 02 09:46:54 compute-1 sshd-session[112037]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:46:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:54 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:54 compute-1 sudo[112191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiftoqmpbangeytslkqnwtibjgrfpfsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025614.3590693-19-39942308495037/AnsiballZ_tempfile.py'
Feb 02 09:46:54 compute-1 sudo[112191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:54 compute-1 sshd-session[112035]: Invalid user user from 123.58.212.100 port 39070
Feb 02 09:46:54 compute-1 python3.9[112193]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 02 09:46:54 compute-1 sudo[112191]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:55 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:55 compute-1 sshd-session[112035]: Connection closed by invalid user user 123.58.212.100 port 39070 [preauth]
Feb 02 09:46:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:55.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:55 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8002b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:55 compute-1 ceph-mon[80115]: pgmap v207: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:46:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:55.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:46:55 compute-1 sudo[112345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjordexnerxpreblhmwmntorzyopuijl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025615.2486267-55-263679992971239/AnsiballZ_stat.py'
Feb 02 09:46:55 compute-1 sudo[112345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:55 compute-1 python3.9[112347]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:46:55 compute-1 sudo[112345]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:56 compute-1 sshd-session[112270]: Invalid user user from 123.58.212.100 port 39072
Feb 02 09:46:56 compute-1 sudo[112499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajmidlzoisxrebpldoomnjbebbldcmdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025616.0446088-79-45094372000649/AnsiballZ_slurp.py'
Feb 02 09:46:56 compute-1 sudo[112499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:56 compute-1 sshd-session[112270]: Connection closed by invalid user user 123.58.212.100 port 39072 [preauth]
Feb 02 09:46:56 compute-1 python3.9[112501]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Feb 02 09:46:56 compute-1 sudo[112499]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:56 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:57 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:57 compute-1 sudo[112654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxeyjvbzcwmcfqybhfknukiassanuihp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025616.8048086-103-189466703170045/AnsiballZ_stat.py'
Feb 02 09:46:57 compute-1 sudo[112654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:57.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:57 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:57 compute-1 python3.9[112656]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.j21xtwvl follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:46:57 compute-1 sudo[112654]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:57 compute-1 ceph-mon[80115]: pgmap v208: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:46:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:46:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:57.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:46:57 compute-1 sshd-session[112527]: Invalid user user from 123.58.212.100 port 39074
Feb 02 09:46:57 compute-1 sudo[112779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwicrcqbcdzcfwwowhqdqelqjmrcqmjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025616.8048086-103-189466703170045/AnsiballZ_copy.py'
Feb 02 09:46:57 compute-1 sudo[112779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:57 compute-1 sshd-session[112527]: Connection closed by invalid user user 123.58.212.100 port 39074 [preauth]
Feb 02 09:46:57 compute-1 python3.9[112781]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.j21xtwvl mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025616.8048086-103-189466703170045/.source.j21xtwvl _original_basename=.ztnzdaqt follow=False checksum=0e76d40d6d80e8dcbe1329e9f4d8b9bf39ee9960 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:57 compute-1 sudo[112779]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:58 compute-1 sudo[112934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnbjjtlaazbmsihbtovbrfxqwcpphmmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025618.1794634-148-172018558704862/AnsiballZ_setup.py'
Feb 02 09:46:58 compute-1 sudo[112934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:58 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8002b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:59 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:59 compute-1 python3.9[112936]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:46:59 compute-1 sudo[112934]: pam_unix(sudo:session): session closed for user root
Feb 02 09:46:59 compute-1 sshd-session[112806]: Invalid user user from 123.58.212.100 port 39088
Feb 02 09:46:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:46:59 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:46:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:46:59.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:59 compute-1 sshd-session[112806]: Connection closed by invalid user user 123.58.212.100 port 39088 [preauth]
Feb 02 09:46:59 compute-1 ceph-mon[80115]: pgmap v209: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:46:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:46:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:46:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:46:59.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:46:59 compute-1 sudo[113088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfgmjuqpdwcdalyjdsdpzojeyvpsxfex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025619.3144286-173-61498045654769/AnsiballZ_blockinfile.py'
Feb 02 09:46:59 compute-1 sudo[113088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:46:59 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 02 09:46:59 compute-1 python3.9[113090]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCpaaLVd9Gqbxcksz46sKNkp3Eu2TY3fUjtOhbkLQru93qJt/RNDTocNiUrE9VAj/UXp9dZqSHg1Hr7ScqXu7zqgZ9i+mq6N7P7QR+ZkN8jLQSybnPztI7X/QWaPhT0j1ArMrYk2F2Me+kAQiFL0GoR2d8udRElL8YKKIYQ6zjC/h2ZsU0WyVET9uiTgeMP/njtMzRSgO2Wp6no4KqJEOMSEY1lgURjVsMWkTr4hGz523SooA41GzquuNamnj1ELwKZSAH+TtVgI8oFJ2T+5TZiE/oW2MizbBwjKA3V5DlnGOEG49eG+LhZ/eWb6jQ7OnJARA/iLU/FsJ+CaGSbRK20/OWXP4JSZu7liaD0DIHM0DwrjEnQcXI6SbfAoAQ494KFtZvFamem7CPtrVhgNAKqybRbDcEQGpDxQgrWeA3m4HyGIBym+IvMUfYlNke9frCkwNpXRH93TK6E/ziPFrBHKkdRcFxVdsG2u1Y+adxOQk7KCjq/skzXBPCPDaHnzBM=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIKtQmhiX/LRkxZONUn47u07V1HNePVW1EWKmTbmuGuY
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE0cPV3BwiB9Cc5Ne48bCCSZwMzF/hH7iFXwAiP/TK2pzWYsdZw1mOSJ+vDu1KclkDtQKmwN6Cu0N7j7domqlzE=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDXvxaVTYbHTHv+9EzKdF3T8+Yr2otW2YLuSqNTF+yJaKACfB7wDlIhKDGTHiU1FDrkO4tJ+R3OL/2ZXoIlxp5JSdCgcb42X+5PTj1wPkayVlQW7e0wQvT3kYhrcPtjLgk4T39/sionMGYUat45idwoB6hUSPLdk/L5+n0/3LEg1lByOM/B1/p8wGzHn6H9CWoIP3Ctd6lmrxtIVU1u+pxiBVQCcMjw5gtqsB54l670fL7El5XEkqjRjKHhylw9QTYN3AWMKuQKwcjClm/57/SoFMP7o52r653wGDH9cpvDgs0RYG4bA1mGY5OMkYbDJfcy0CViKEu5qWW4cTBLh/Z88D2EuNlINj3Q1YJk3RwF6vYl31MMsbBW10YhIiBJrA5XF0BLARqBOZ1e6v7JKTSwa7wGGtRzEzbY+me9zl6ZhhDru/I+h24J4MeBA07HvQIS2v8O95tPz76YZJ3DkWlywFWbALG8M4+fkpuQtvVpBZMgdvIWW0kfXO/grGnrgY8=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIG3OEs+fDFWrKRKifY4uXYtOpS/6/8E88qPQNs1apj/z
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFy9hRh0QDNcy30491f4FwmL+9BopSuPxbkVyWhY9VytT/FG5rm9/DLYyukpd9IKttcZyerq0gzfokDrht76FB4=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDTA16t8OsOL4s99BOiNF3vckRPwnc9DwrgEMUjNAF5ofBbR7O7JlFD47GnI33lZr51vVc0wnvTxhpFA0jVvhKqVWdJ3lApNf34bJmaJBr8uiy/i3Q84MsUtXBLQ0FDCbwgaPnreNbMz3ae+u9H+Z73jQSP+gnQ5oYWhONHgO4HHkF8K7a8Bow3H5qwfbHz8o7mFQmTpYHwOcwhA53BTbh1NiEJZJNSg7wi1hH7vELUAzts1cbF2slTE0nh8XjMogq9ukokrCIKfE+xX7PmAawCuMnfvGX93zF1298pGcUKqvpnIfUOMDGtJtYEZ8sWsr5aH1YXIoJfHuux/YosRx3XDD5oEcpX0nYKVW6bumHsFIS199XAM5LtWWNr2eMcrbZhVwHNdELC6zoL7QjbBQ+2j/+8nJLq9vIghewgO3EFWK3r7kIVQZg8GYLZ/yisH4cvzUTACRXAF+1o2rq+AUfX3nTSsrqyZQUwlnWpc1vsceEO0Lsuac5tvGylnsJBfmM=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN317jbKb2FNELHPgcKtyDLq5kCgCZN/b/8qYDuirt4l
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNpgfrlTfGut7rGFnGEpIiXrs2U1SQK0Fr1bAmmw8notvdnn6jtGfPfwX96hGwcOu4AlAS/i7X7XgbLw573Ooww=
                                              create=True mode=0644 path=/tmp/ansible.j21xtwvl state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:46:59 compute-1 sudo[113088]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:00 compute-1 sshd-session[113013]: Invalid user user from 123.58.212.100 port 39100
Feb 02 09:47:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:00 compute-1 sudo[113243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzquqlaukdcllyqaiwfnthgpwyhjjvcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025620.174683-197-44937279199017/AnsiballZ_command.py'
Feb 02 09:47:00 compute-1 sudo[113243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:00 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:00 compute-1 python3.9[113245]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.j21xtwvl' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:47:00 compute-1 sudo[113243]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:00 compute-1 sshd-session[113013]: Connection closed by invalid user user 123.58.212.100 port 39100 [preauth]
Feb 02 09:47:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:01 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8002b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:01 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:01.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:01 compute-1 ceph-mon[80115]: pgmap v210: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:01 compute-1 sudo[113400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfcorhccnkimrrgijtssaeikqtvdlvbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025620.9826193-221-124322475271545/AnsiballZ_file.py'
Feb 02 09:47:01 compute-1 sudo[113400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:01.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:01 compute-1 python3.9[113402]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.j21xtwvl state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:01 compute-1 sudo[113400]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:02 compute-1 sshd-session[112040]: Connection closed by 192.168.122.30 port 33894
Feb 02 09:47:02 compute-1 sshd-session[112037]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:47:02 compute-1 systemd-logind[805]: Session 44 logged out. Waiting for processes to exit.
Feb 02 09:47:02 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Feb 02 09:47:02 compute-1 systemd[1]: session-44.scope: Consumed 4.503s CPU time.
Feb 02 09:47:02 compute-1 systemd-logind[805]: Removed session 44.
Feb 02 09:47:02 compute-1 sshd-session[113325]: Invalid user user from 123.58.212.100 port 39116
Feb 02 09:47:02 compute-1 sshd-session[113325]: Connection closed by invalid user user 123.58.212.100 port 39116 [preauth]
Feb 02 09:47:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:47:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:02 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:03 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:03 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:03.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:03 compute-1 ceph-mon[80115]: pgmap v211: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:03.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:03 compute-1 sshd-session[113427]: Invalid user user from 123.58.212.100 port 39184
Feb 02 09:47:03 compute-1 sshd-session[113427]: Connection closed by invalid user user 123.58.212.100 port 39184 [preauth]
Feb 02 09:47:04 compute-1 ceph-mon[80115]: pgmap v212: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:47:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:04 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:04 compute-1 sshd-session[113433]: Invalid user user from 123.58.212.100 port 39190
Feb 02 09:47:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:05 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:05 compute-1 sshd-session[113433]: Connection closed by invalid user user 123.58.212.100 port 39190 [preauth]
Feb 02 09:47:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:05 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:05.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:05.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:06 compute-1 sshd-session[113436]: Invalid user user from 123.58.212.100 port 39202
Feb 02 09:47:06 compute-1 sshd-session[113436]: Connection closed by invalid user user 123.58.212.100 port 39202 [preauth]
Feb 02 09:47:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:06 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4001e20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:07 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:07 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:07.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:07 compute-1 sshd-session[113441]: Accepted publickey for zuul from 192.168.122.30 port 60978 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:47:07 compute-1 systemd-logind[805]: New session 45 of user zuul.
Feb 02 09:47:07 compute-1 systemd[1]: Started Session 45 of User zuul.
Feb 02 09:47:07 compute-1 sshd-session[113441]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:47:07 compute-1 ceph-mon[80115]: pgmap v213: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:07.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:07 compute-1 sshd-session[113439]: Invalid user user from 123.58.212.100 port 39208
Feb 02 09:47:08 compute-1 sshd-session[113439]: Connection closed by invalid user user 123.58.212.100 port 39208 [preauth]
Feb 02 09:47:08 compute-1 python3.9[113594]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:47:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:08 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:09 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4001e20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:09 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:47:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:09.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:47:09 compute-1 sshd-session[113595]: Invalid user user from 123.58.212.100 port 39216
Feb 02 09:47:09 compute-1 ceph-mon[80115]: pgmap v214: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:09 compute-1 sudo[113751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tspowbcqskoxxnonllvaijlotyqnfzbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025628.8332868-52-228198791571270/AnsiballZ_systemd.py'
Feb 02 09:47:09 compute-1 sudo[113751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:09.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:09 compute-1 sshd-session[113595]: Connection closed by invalid user user 123.58.212.100 port 39216 [preauth]
Feb 02 09:47:09 compute-1 python3.9[113753]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 02 09:47:09 compute-1 sudo[113751]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:10 compute-1 sudo[113907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zagovfbjfbatjomforvwwagblbajzuny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025630.0320055-76-117087887870537/AnsiballZ_systemd.py'
Feb 02 09:47:10 compute-1 sudo[113907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:10 compute-1 sshd-session[113756]: Invalid user user from 123.58.212.100 port 39228
Feb 02 09:47:10 compute-1 python3.9[113909]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:47:10 compute-1 sudo[113907]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:10 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:11 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:11 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4001e20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:11.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:11 compute-1 sshd-session[113756]: Connection closed by invalid user user 123.58.212.100 port 39228 [preauth]
Feb 02 09:47:11 compute-1 ceph-mon[80115]: pgmap v215: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:11.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:11 compute-1 sudo[114063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoltqtdexfhfleidxcfnsqvvtumwibim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025631.5263941-103-206086253288562/AnsiballZ_command.py'
Feb 02 09:47:11 compute-1 sudo[114063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:12 compute-1 sshd-session[113912]: Invalid user user from 123.58.212.100 port 39232
Feb 02 09:47:12 compute-1 python3.9[114065]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:47:12 compute-1 sudo[114063]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:12 compute-1 sshd-session[113912]: Connection closed by invalid user user 123.58.212.100 port 39232 [preauth]
Feb 02 09:47:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:12 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:12 compute-1 sudo[114219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvflauspamzniozgucqobfaoejtnyhbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025632.449839-127-14841498917696/AnsiballZ_stat.py'
Feb 02 09:47:12 compute-1 sudo[114219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:13 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:13 compute-1 python3.9[114221]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:47:13 compute-1 sudo[114219]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:13 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:13.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:13 compute-1 ceph-mon[80115]: pgmap v216: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:13 compute-1 sshd-session[114143]: Invalid user user from 123.58.212.100 port 35914
Feb 02 09:47:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:13.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:13 compute-1 sshd-session[114143]: Connection closed by invalid user user 123.58.212.100 port 35914 [preauth]
Feb 02 09:47:13 compute-1 sudo[114322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:47:13 compute-1 sudo[114322]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:47:13 compute-1 sudo[114322]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:13 compute-1 sudo[114397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iitbkfwlkwwhyrbavowqgsnuvvsffovd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025633.434134-154-5543456804749/AnsiballZ_file.py'
Feb 02 09:47:13 compute-1 sudo[114397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:14 compute-1 python3.9[114399]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:14 compute-1 sudo[114397]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:14 compute-1 ceph-mon[80115]: pgmap v217: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:47:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:14 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:14 compute-1 sshd-session[114400]: Invalid user user from 123.58.212.100 port 35916
Feb 02 09:47:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:15 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:15 compute-1 sshd-session[113444]: Connection closed by 192.168.122.30 port 60978
Feb 02 09:47:15 compute-1 sshd-session[113441]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:47:15 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Feb 02 09:47:15 compute-1 systemd[1]: session-45.scope: Consumed 3.793s CPU time.
Feb 02 09:47:15 compute-1 systemd-logind[805]: Session 45 logged out. Waiting for processes to exit.
Feb 02 09:47:15 compute-1 systemd-logind[805]: Removed session 45.
Feb 02 09:47:15 compute-1 sshd-session[114400]: Connection closed by invalid user user 123.58.212.100 port 35916 [preauth]
Feb 02 09:47:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:15 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:15.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:15.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.413219) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636413354, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1914, "num_deletes": 250, "total_data_size": 5251899, "memory_usage": 5339912, "flush_reason": "Manual Compaction"}
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Feb 02 09:47:16 compute-1 sshd-session[114427]: Invalid user user from 123.58.212.100 port 35930
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636429352, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1997941, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10754, "largest_seqno": 12663, "table_properties": {"data_size": 1992244, "index_size": 2837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14181, "raw_average_key_size": 20, "raw_value_size": 1979844, "raw_average_value_size": 2800, "num_data_blocks": 126, "num_entries": 707, "num_filter_entries": 707, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025454, "oldest_key_time": 1770025454, "file_creation_time": 1770025636, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 16172 microseconds, and 4081 cpu microseconds.
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.429414) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1997941 bytes OK
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.429432) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.430707) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.430719) EVENT_LOG_v1 {"time_micros": 1770025636430715, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.430737) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 5243327, prev total WAL file size 5243327, number of live WAL files 2.
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.431627) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1951KB)], [21(13MB)]
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636431713, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16268083, "oldest_snapshot_seqno": -1}
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4357 keys, 14293866 bytes, temperature: kUnknown
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636547842, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14293866, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14260716, "index_size": 21136, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10949, "raw_key_size": 110176, "raw_average_key_size": 25, "raw_value_size": 14177084, "raw_average_value_size": 3253, "num_data_blocks": 906, "num_entries": 4357, "num_filter_entries": 4357, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025636, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.548324) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14293866 bytes
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.551279) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.9 rd, 122.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 13.6 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(15.3) write-amplify(7.2) OK, records in: 4784, records dropped: 427 output_compression: NoCompression
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.551319) EVENT_LOG_v1 {"time_micros": 1770025636551301, "job": 10, "event": "compaction_finished", "compaction_time_micros": 116264, "compaction_time_cpu_micros": 33950, "output_level": 6, "num_output_files": 1, "total_output_size": 14293866, "num_input_records": 4784, "num_output_records": 4357, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636551982, "job": 10, "event": "table_file_deletion", "file_number": 23}
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025636554920, "job": 10, "event": "table_file_deletion", "file_number": 21}
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.431493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.554997) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.555007) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.555010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.555013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:16 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:16.555017) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:16 compute-1 sshd-session[114427]: Connection closed by invalid user user 123.58.212.100 port 35930 [preauth]
Feb 02 09:47:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:16 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:17 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:17 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:17.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:17 compute-1 ceph-mon[80115]: pgmap v218: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:47:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:17.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:17 compute-1 sshd-session[114430]: Invalid user user from 123.58.212.100 port 35940
Feb 02 09:47:18 compute-1 sshd-session[114430]: Connection closed by invalid user user 123.58.212.100 port 35940 [preauth]
Feb 02 09:47:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:18 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:19 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:19 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:19 compute-1 sshd-session[114432]: Invalid user user from 123.58.212.100 port 35952
Feb 02 09:47:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:19.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:19 compute-1 ceph-mon[80115]: pgmap v219: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:19 compute-1 sshd-session[114432]: Connection closed by invalid user user 123.58.212.100 port 35952 [preauth]
Feb 02 09:47:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:19.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:19 compute-1 sudo[114435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:47:19 compute-1 sudo[114435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:47:19 compute-1 sudo[114435]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:19 compute-1 sudo[114460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:47:19 compute-1 sudo[114460]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:47:20 compute-1 sudo[114460]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:20 compute-1 sshd-session[114505]: Accepted publickey for zuul from 192.168.122.30 port 42644 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:47:20 compute-1 systemd-logind[805]: New session 46 of user zuul.
Feb 02 09:47:20 compute-1 systemd[1]: Started Session 46 of User zuul.
Feb 02 09:47:20 compute-1 sshd-session[114505]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:47:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:47:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:47:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:47:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:47:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:47:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:47:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:47:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:20 compute-1 sshd-session[114485]: Invalid user user from 123.58.212.100 port 35958
Feb 02 09:47:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:20 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:20 compute-1 sshd-session[114485]: Connection closed by invalid user user 123.58.212.100 port 35958 [preauth]
Feb 02 09:47:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:21 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:21 compute-1 python3.9[114673]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:47:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:21 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:21.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:21 compute-1 ceph-mon[80115]: pgmap v220: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:21 compute-1 ceph-mon[80115]: pgmap v221: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 306 B/s rd, 0 op/s
Feb 02 09:47:21 compute-1 ceph-mon[80115]: pgmap v222: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Feb 02 09:47:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:21.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:22 compute-1 sudo[114829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igyyqcihcxcmxakbhbvbqdvmrbtorxky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025641.8425932-58-134099614559754/AnsiballZ_setup.py'
Feb 02 09:47:22 compute-1 sudo[114829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:22 compute-1 python3.9[114831]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:47:22 compute-1 sshd-session[114702]: Invalid user user from 123.58.212.100 port 35966
Feb 02 09:47:22 compute-1 sudo[114829]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:22 compute-1 sshd-session[114702]: Connection closed by invalid user user 123.58.212.100 port 35966 [preauth]
Feb 02 09:47:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:22 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:23 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:23 compute-1 sudo[114916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqrhugwdvvmpyrrfqtuoprunqllpzzar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025641.8425932-58-134099614559754/AnsiballZ_dnf.py'
Feb 02 09:47:23 compute-1 sudo[114916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:23 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:23.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:23 compute-1 ceph-mon[80115]: pgmap v223: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Feb 02 09:47:23 compute-1 python3.9[114918]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 02 09:47:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:23.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:23 compute-1 sshd-session[114841]: Invalid user user from 123.58.212.100 port 47938
Feb 02 09:47:24 compute-1 sshd-session[114841]: Connection closed by invalid user user 123.58.212.100 port 47938 [preauth]
Feb 02 09:47:24 compute-1 ceph-mon[80115]: pgmap v224: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Feb 02 09:47:24 compute-1 sudo[114916]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:24 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:25 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:25.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:25.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:25 compute-1 sudo[115073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:47:25 compute-1 sudo[115073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:47:25 compute-1 sudo[115073]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:25 compute-1 sshd-session[114920]: Invalid user user from 123.58.212.100 port 47940
Feb 02 09:47:25 compute-1 python3.9[115072]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:47:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:25 compute-1 sshd-session[114920]: Connection closed by invalid user user 123.58.212.100 port 47940 [preauth]
Feb 02 09:47:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:47:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:47:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:26 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:26 compute-1 python3.9[115251]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 02 09:47:26 compute-1 sshd-session[115123]: Invalid user user from 123.58.212.100 port 47944
Feb 02 09:47:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:27 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:27 compute-1 sshd-session[115123]: Connection closed by invalid user user 123.58.212.100 port 47944 [preauth]
Feb 02 09:47:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:27 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:27.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:27 compute-1 ceph-mon[80115]: pgmap v225: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Feb 02 09:47:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:27.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:27 compute-1 python3.9[115402]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:47:28 compute-1 sshd-session[115386]: Invalid user user from 123.58.212.100 port 47946
Feb 02 09:47:28 compute-1 ceph-mon[80115]: pgmap v226: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Feb 02 09:47:28 compute-1 python3.9[115553]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:47:28 compute-1 sshd-session[115386]: Connection closed by invalid user user 123.58.212.100 port 47946 [preauth]
Feb 02 09:47:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:28 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:29 compute-1 sshd-session[114522]: Connection closed by 192.168.122.30 port 42644
Feb 02 09:47:29 compute-1 sshd-session[114505]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:47:29 compute-1 systemd-logind[805]: Session 46 logged out. Waiting for processes to exit.
Feb 02 09:47:29 compute-1 systemd[1]: session-46.scope: Deactivated successfully.
Feb 02 09:47:29 compute-1 systemd[1]: session-46.scope: Consumed 5.416s CPU time.
Feb 02 09:47:29 compute-1 systemd-logind[805]: Removed session 46.
Feb 02 09:47:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:29 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:29 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:29.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:29.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:29 compute-1 sshd-session[115579]: Invalid user user from 123.58.212.100 port 47948
Feb 02 09:47:30 compute-1 sshd-session[115579]: Connection closed by invalid user user 123.58.212.100 port 47948 [preauth]
Feb 02 09:47:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:30 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:31 compute-1 sshd-session[115581]: Invalid user user from 123.58.212.100 port 47950
Feb 02 09:47:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:31 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:31.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:31 compute-1 ceph-mon[80115]: pgmap v227: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Feb 02 09:47:31 compute-1 sshd-session[115581]: Connection closed by invalid user user 123.58.212.100 port 47950 [preauth]
Feb 02 09:47:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:31.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:47:32 compute-1 sshd-session[115584]: Invalid user user from 123.58.212.100 port 44706
Feb 02 09:47:32 compute-1 sshd-session[115584]: Connection closed by invalid user user 123.58.212.100 port 44706 [preauth]
Feb 02 09:47:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:32 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:33 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:33 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:33.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:33 compute-1 ceph-mon[80115]: pgmap v228: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:33.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:33 compute-1 sudo[115591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:47:33 compute-1 sudo[115591]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:47:33 compute-1 sudo[115591]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:33 compute-1 sshd-session[115587]: Invalid user user from 123.58.212.100 port 44714
Feb 02 09:47:34 compute-1 sshd-session[115589]: Invalid user solv from 80.94.92.184 port 35056
Feb 02 09:47:34 compute-1 sshd-session[115589]: Connection closed by invalid user solv 80.94.92.184 port 35056 [preauth]
Feb 02 09:47:34 compute-1 sshd-session[115587]: Connection closed by invalid user user 123.58.212.100 port 44714 [preauth]
Feb 02 09:47:34 compute-1 ceph-mon[80115]: pgmap v229: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:47:34 compute-1 sshd-session[115616]: Accepted publickey for zuul from 192.168.122.30 port 39040 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:47:34 compute-1 systemd-logind[805]: New session 47 of user zuul.
Feb 02 09:47:34 compute-1 systemd[1]: Started Session 47 of User zuul.
Feb 02 09:47:34 compute-1 sshd-session[115616]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:47:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:34 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:35 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:35 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:35.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:35.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:35 compute-1 sshd-session[115643]: Invalid user user from 123.58.212.100 port 44716
Feb 02 09:47:35 compute-1 python3.9[115774]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:47:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:35 compute-1 sshd-session[115643]: Connection closed by invalid user user 123.58.212.100 port 44716 [preauth]
Feb 02 09:47:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:36 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:36 compute-1 sshd-session[115803]: Invalid user user from 123.58.212.100 port 44728
Feb 02 09:47:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:37 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:37 compute-1 sudo[115931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwnklbfzpulgpyoeipbglrthvczfipts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025656.7507834-107-92121607808523/AnsiballZ_file.py'
Feb 02 09:47:37 compute-1 sudo[115931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:37 compute-1 sshd-session[115803]: Connection closed by invalid user user 123.58.212.100 port 44728 [preauth]
Feb 02 09:47:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:37 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:37.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:37 compute-1 python3.9[115933]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:37 compute-1 sudo[115931]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:37 compute-1 ceph-mon[80115]: pgmap v230: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:37.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:37 compute-1 sudo[116085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rewkmlbxcisoxnlbjuwqwtvedhpsdhpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025657.4581661-107-10135048627153/AnsiballZ_file.py'
Feb 02 09:47:37 compute-1 sudo[116085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:37 compute-1 python3.9[116087]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:37 compute-1 sudo[116085]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:38 compute-1 sshd-session[115958]: Invalid user user from 123.58.212.100 port 44740
Feb 02 09:47:38 compute-1 sshd-session[115958]: Connection closed by invalid user user 123.58.212.100 port 44740 [preauth]
Feb 02 09:47:38 compute-1 sudo[116237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaqtqpqnhyvzmvctzgzadsbxuprsdrrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025658.1468506-149-25167533310985/AnsiballZ_stat.py'
Feb 02 09:47:38 compute-1 sudo[116237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:38 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:38 compute-1 python3.9[116239]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:38 compute-1 sudo[116237]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:39 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:39 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:39.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:39 compute-1 ceph-mon[80115]: pgmap v231: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:39 compute-1 sudo[116363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkbwmqeqgglhwmpquivnfsvkrbbpoffu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025658.1468506-149-25167533310985/AnsiballZ_copy.py'
Feb 02 09:47:39 compute-1 sudo[116363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:39.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:39 compute-1 python3.9[116365]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025658.1468506-149-25167533310985/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=bbbde0bea78cfeba25f07606728ce69c42c7d6f3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:39 compute-1 sudo[116363]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:39 compute-1 sshd-session[116241]: Invalid user user from 123.58.212.100 port 44750
Feb 02 09:47:39 compute-1 sshd-session[116241]: Connection closed by invalid user user 123.58.212.100 port 44750 [preauth]
Feb 02 09:47:40 compute-1 sudo[116515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stqrayrlmrkthxbpfykvglpzitpuvhhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025659.7466278-149-59931358464830/AnsiballZ_stat.py'
Feb 02 09:47:40 compute-1 sudo[116515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:40 compute-1 python3.9[116517]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:40 compute-1 sudo[116515]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:40 compute-1 ceph-mon[80115]: pgmap v232: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:40 compute-1 sudo[116640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzwgcrxejgwezzkvazhanrstrnxqoqhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025659.7466278-149-59931358464830/AnsiballZ_copy.py'
Feb 02 09:47:40 compute-1 sudo[116640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:40 compute-1 python3.9[116642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025659.7466278-149-59931358464830/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=98d348615e61a9b68b5c5fd470bc9aeb831c56b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:40 compute-1 sudo[116640]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:40 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:41 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:41 compute-1 sudo[116793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnhhyqkvknookcalezciazesqlqnndvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025660.8416958-149-279182533082446/AnsiballZ_stat.py'
Feb 02 09:47:41 compute-1 sudo[116793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:41 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:41.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:41 compute-1 python3.9[116795]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:41 compute-1 sshd-session[116518]: Invalid user user from 123.58.212.100 port 44760
Feb 02 09:47:41 compute-1 sudo[116793]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:41.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:41 compute-1 sshd-session[116518]: Connection closed by invalid user user 123.58.212.100 port 44760 [preauth]
Feb 02 09:47:41 compute-1 sudo[116916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvlhfcxugwdrgzheirtssxypuuoodeqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025660.8416958-149-279182533082446/AnsiballZ_copy.py'
Feb 02 09:47:41 compute-1 sudo[116916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:41 compute-1 python3.9[116918]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025660.8416958-149-279182533082446/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=42ff746d592f57a5dc4052c4590df75e42f43be8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:41 compute-1 sudo[116916]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:42 compute-1 sudo[117070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yszwgtcgpmrhnmwgmlxxgkwjhdorddnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025662.0619612-279-173434607578561/AnsiballZ_file.py'
Feb 02 09:47:42 compute-1 sudo[117070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:42 compute-1 python3.9[117072]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:42 compute-1 sudo[117070]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:42 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:42 compute-1 sshd-session[116943]: Invalid user user from 123.58.212.100 port 43848
Feb 02 09:47:42 compute-1 sudo[117223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srcdiktaaoerqhikylokzmhokkenbfdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025662.6718092-279-240650010379382/AnsiballZ_file.py'
Feb 02 09:47:42 compute-1 sudo[117223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:43 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:43 compute-1 sshd-session[116943]: Connection closed by invalid user user 123.58.212.100 port 43848 [preauth]
Feb 02 09:47:43 compute-1 python3.9[117225]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:43 compute-1 sudo[117223]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:43 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003980 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:47:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:43.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:47:43 compute-1 ceph-mon[80115]: pgmap v233: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:43.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:43 compute-1 sudo[117377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqjvhuesfdwfidzgesbbmoqtwezlkpmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025663.3480186-326-207816344631668/AnsiballZ_stat.py'
Feb 02 09:47:43 compute-1 sudo[117377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:43 compute-1 python3.9[117379]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:43 compute-1 sudo[117377]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:44 compute-1 sshd-session[117252]: Invalid user user from 123.58.212.100 port 43864
Feb 02 09:47:44 compute-1 sudo[117500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiymnefedwzubfxbdnhigxgczxrfjsnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025663.3480186-326-207816344631668/AnsiballZ_copy.py'
Feb 02 09:47:44 compute-1 sudo[117500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:44 compute-1 python3.9[117502]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025663.3480186-326-207816344631668/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=453d3b7a2aef7dfcee9fd995557ae6920a7b055b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:44 compute-1 sshd-session[117252]: Connection closed by invalid user user 123.58.212.100 port 43864 [preauth]
Feb 02 09:47:44 compute-1 sudo[117500]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:44 compute-1 sudo[117655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oajgemaajrmmhwvlykgfwoageghfezmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025664.537104-326-46680391540927/AnsiballZ_stat.py'
Feb 02 09:47:44 compute-1 sudo[117655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:44 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:44 compute-1 python3.9[117657]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:45 compute-1 sudo[117655]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:45 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8001bf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:45 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac001ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:45.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:45 compute-1 sudo[117778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxmsbeinvluxmarlaklmfdtlbeyisumt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025664.537104-326-46680391540927/AnsiballZ_copy.py'
Feb 02 09:47:45 compute-1 sudo[117778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:45 compute-1 ceph-mon[80115]: pgmap v234: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:47:45 compute-1 python3.9[117780]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025664.537104-326-46680391540927/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=d08fd1db2672bef6291fde5319a05fae0b3732d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:45 compute-1 sudo[117778]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:45.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:45 compute-1 sshd-session[117602]: Invalid user user from 123.58.212.100 port 43876
Feb 02 09:47:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:45 compute-1 sshd-session[117602]: Connection closed by invalid user user 123.58.212.100 port 43876 [preauth]
Feb 02 09:47:45 compute-1 sudo[117930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmxnjrktypujhjrrglcvhkvqekaukkle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025665.5931032-326-25727540431725/AnsiballZ_stat.py'
Feb 02 09:47:45 compute-1 sudo[117930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:46 compute-1 python3.9[117932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:46 compute-1 sudo[117930]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:46 compute-1 sudo[118055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oinelputmjbnwyxqpbdtfuztzqajworl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025665.5931032-326-25727540431725/AnsiballZ_copy.py'
Feb 02 09:47:46 compute-1 sudo[118055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:46 compute-1 python3.9[118057]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025665.5931032-326-25727540431725/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=41662b67015605c8326ed9e90f71bc0a5c935d1a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:46 compute-1 sudo[118055]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:46 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb40039a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:47 compute-1 sshd-session[117933]: Invalid user user from 123.58.212.100 port 43888
Feb 02 09:47:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:47 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb40039a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:47 compute-1 sudo[118208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyogmssttwmrldbavjqyxmrcfqvsdhym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025666.912813-448-227275530095456/AnsiballZ_file.py'
Feb 02 09:47:47 compute-1 sudo[118208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:47 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:47 compute-1 sshd-session[117933]: Connection closed by invalid user user 123.58.212.100 port 43888 [preauth]
Feb 02 09:47:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:47.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:47 compute-1 ceph-mon[80115]: pgmap v235: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:47:47 compute-1 python3.9[118210]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:47 compute-1 sudo[118208]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:47.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:47 compute-1 sudo[118362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnudxkhyugpzjwwvwytwbosfewkvijdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025667.6308134-448-180490127710507/AnsiballZ_file.py'
Feb 02 09:47:47 compute-1 sudo[118362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:48 compute-1 python3.9[118364]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:48 compute-1 sudo[118362]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:48 compute-1 sudo[118514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfqlxqkmbioaiksnngivqzgfzyqcrxnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025668.250418-495-150443268582787/AnsiballZ_stat.py'
Feb 02 09:47:48 compute-1 sudo[118514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:48 compute-1 sshd-session[118211]: Invalid user user from 123.58.212.100 port 43902
Feb 02 09:47:48 compute-1 python3.9[118516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:48 compute-1 sudo[118514]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:48 compute-1 sshd-session[118211]: Connection closed by invalid user user 123.58.212.100 port 43902 [preauth]
Feb 02 09:47:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:48 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac001ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:48 compute-1 sudo[118638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voacrvaajjjmuwhdnyckimnjxyrbmula ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025668.250418-495-150443268582787/AnsiballZ_copy.py'
Feb 02 09:47:48 compute-1 sudo[118638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:49 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:49 compute-1 python3.9[118640]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025668.250418-495-150443268582787/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=cc0f86f89c3e4ae1f8702736ba65f2dc1e3f1c08 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:49 compute-1 sudo[118638]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:49 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb40039c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:49.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:49 compute-1 ceph-mon[80115]: pgmap v236: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:49.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:49 compute-1 sudo[118792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkcmnwbtvunushmrcozkcrtzgeoyccno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025669.3261507-495-62181352058023/AnsiballZ_stat.py'
Feb 02 09:47:49 compute-1 sudo[118792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:49 compute-1 python3.9[118794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:49 compute-1 sudo[118792]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:50 compute-1 sshd-session[118641]: Invalid user user from 123.58.212.100 port 43904
Feb 02 09:47:50 compute-1 sudo[118915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkvxlmrjzxbduttpuctjmmcbwadwyyko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025669.3261507-495-62181352058023/AnsiballZ_copy.py'
Feb 02 09:47:50 compute-1 sudo[118915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:50 compute-1 sshd-session[118641]: Connection closed by invalid user user 123.58.212.100 port 43904 [preauth]
Feb 02 09:47:50 compute-1 python3.9[118917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025669.3261507-495-62181352058023/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=d08fd1db2672bef6291fde5319a05fae0b3732d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:50 compute-1 sudo[118915]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:50 compute-1 ceph-mon[80115]: pgmap v237: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:50 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:50 compute-1 sudo[119070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsfdtodnthtxnegxslgusdztzwcajfxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025670.5543776-495-184201265955902/AnsiballZ_stat.py'
Feb 02 09:47:50 compute-1 sudo[119070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:51 compute-1 python3.9[119072]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:51 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac001ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:51 compute-1 sudo[119070]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:51 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:51.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:51 compute-1 sshd-session[118942]: Invalid user user from 123.58.212.100 port 43918
Feb 02 09:47:51 compute-1 sudo[119193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnhocqtgngnliutwileoxzafqashtvbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025670.5543776-495-184201265955902/AnsiballZ_copy.py'
Feb 02 09:47:51 compute-1 sudo[119193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:51.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:51 compute-1 sshd-session[118942]: Connection closed by invalid user user 123.58.212.100 port 43918 [preauth]
Feb 02 09:47:51 compute-1 python3.9[119195]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025670.5543776-495-184201265955902/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=a845be08b1950ef1f3ad8a3b70e4630a68f71b53 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:51 compute-1 sudo[119193]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:52 compute-1 sudo[119347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krpkrvuxgjmvqltngxgllgwzbgzcndvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025672.3942893-662-160750128563108/AnsiballZ_file.py'
Feb 02 09:47:52 compute-1 sudo[119347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:52 compute-1 sshd-session[119220]: Invalid user user from 123.58.212.100 port 52868
Feb 02 09:47:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:52 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb40039e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:52 compute-1 python3.9[119349]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:52 compute-1 sudo[119347]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:53 compute-1 sshd-session[119220]: Connection closed by invalid user user 123.58.212.100 port 52868 [preauth]
Feb 02 09:47:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:53 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:53 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac003030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:47:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:47:53 compute-1 sudo[119502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrxouivwkhgzcpxvdpyczcszqfivlvng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025673.045883-699-60187974682127/AnsiballZ_stat.py'
Feb 02 09:47:53 compute-1 sudo[119502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:53 compute-1 ceph-mon[80115]: pgmap v238: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:53 compute-1 python3.9[119504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:53.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:53 compute-1 sudo[119502]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:53 compute-1 sudo[119625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auprfbwqmqpkxceipymtalyaqblqgyqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025673.045883-699-60187974682127/AnsiballZ_copy.py'
Feb 02 09:47:53 compute-1 sudo[119625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:53 compute-1 sudo[119626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:47:53 compute-1 sudo[119626]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:47:53 compute-1 sudo[119626]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:54 compute-1 python3.9[119628]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025673.045883-699-60187974682127/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:54 compute-1 sudo[119625]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:54 compute-1 sshd-session[119450]: Invalid user user from 123.58.212.100 port 52874
Feb 02 09:47:54 compute-1 sudo[119802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzzxzysuqlnnynojnilkaclsturslkyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025674.2979095-745-147649713895384/AnsiballZ_file.py'
Feb 02 09:47:54 compute-1 sudo[119802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:54 compute-1 sshd-session[119450]: Connection closed by invalid user user 123.58.212.100 port 52874 [preauth]
Feb 02 09:47:54 compute-1 python3.9[119804]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:54 compute-1 sudo[119802]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:54 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac003030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:55 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:55 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:55 compute-1 sudo[119957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcgihmkzmfmpulehilirlavsjnbvksre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025674.9764016-770-260199763306598/AnsiballZ_stat.py'
Feb 02 09:47:55 compute-1 sudo[119957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:55.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:55 compute-1 ceph-mon[80115]: pgmap v239: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:47:55 compute-1 python3.9[119959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:55 compute-1 sudo[119957]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:55.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:47:55 compute-1 sshd-session[119853]: Invalid user user from 123.58.212.100 port 52886
Feb 02 09:47:55 compute-1 sudo[120080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpbvxdguwppdovtnevhkdiqbwkshpmji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025674.9764016-770-260199763306598/AnsiballZ_copy.py'
Feb 02 09:47:55 compute-1 sudo[120080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:56 compute-1 python3.9[120082]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025674.9764016-770-260199763306598/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:56 compute-1 sudo[120080]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:56 compute-1 sshd-session[119853]: Connection closed by invalid user user 123.58.212.100 port 52886 [preauth]
Feb 02 09:47:56 compute-1 ceph-mon[80115]: pgmap v240: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.468882) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676468945, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 656, "num_deletes": 251, "total_data_size": 1200201, "memory_usage": 1215632, "flush_reason": "Manual Compaction"}
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676481337, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 787349, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12668, "largest_seqno": 13319, "table_properties": {"data_size": 784120, "index_size": 1137, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7412, "raw_average_key_size": 18, "raw_value_size": 777574, "raw_average_value_size": 1963, "num_data_blocks": 50, "num_entries": 396, "num_filter_entries": 396, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025637, "oldest_key_time": 1770025637, "file_creation_time": 1770025676, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 12808 microseconds, and 2187 cpu microseconds.
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.481689) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 787349 bytes OK
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.481815) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.483332) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.483355) EVENT_LOG_v1 {"time_micros": 1770025676483348, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.483376) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 1196582, prev total WAL file size 1196582, number of live WAL files 2.
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.484636) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(768KB)], [24(13MB)]
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676484671, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15081215, "oldest_snapshot_seqno": -1}
Feb 02 09:47:56 compute-1 sudo[120234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaiwwgsyjtcmqptxlgleahabnhcuiokl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025676.3047552-817-222314262149931/AnsiballZ_file.py'
Feb 02 09:47:56 compute-1 sudo[120234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4238 keys, 12202197 bytes, temperature: kUnknown
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676622078, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 12202197, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12171406, "index_size": 19097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 108602, "raw_average_key_size": 25, "raw_value_size": 12091318, "raw_average_value_size": 2853, "num_data_blocks": 807, "num_entries": 4238, "num_filter_entries": 4238, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770025676, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.622413) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 12202197 bytes
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.623696) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.6 rd, 88.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 13.6 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(34.7) write-amplify(15.5) OK, records in: 4753, records dropped: 515 output_compression: NoCompression
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.623728) EVENT_LOG_v1 {"time_micros": 1770025676623713, "job": 12, "event": "compaction_finished", "compaction_time_micros": 137541, "compaction_time_cpu_micros": 19722, "output_level": 6, "num_output_files": 1, "total_output_size": 12202197, "num_input_records": 4753, "num_output_records": 4238, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676623964, "job": 12, "event": "table_file_deletion", "file_number": 26}
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770025676626109, "job": 12, "event": "table_file_deletion", "file_number": 24}
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.484549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626239) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:56 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:47:56.626244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:47:56 compute-1 python3.9[120236]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:56 compute-1 sudo[120234]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:56 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:57 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bd80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:57 compute-1 sshd-session[120130]: Invalid user user from 123.58.212.100 port 52900
Feb 02 09:47:57 compute-1 sudo[120387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcqltgbwoswsastpizheqydlbgadusbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025676.973876-841-158348192090659/AnsiballZ_stat.py'
Feb 02 09:47:57 compute-1 sudo[120387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:57 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:47:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:57.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:47:57 compute-1 sshd-session[120130]: Connection closed by invalid user user 123.58.212.100 port 52900 [preauth]
Feb 02 09:47:57 compute-1 python3.9[120389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:57 compute-1 sudo[120387]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:57.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:57 compute-1 sudo[120512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgqwnfufwxgofmuxvgxkdajfowauqpji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025676.973876-841-158348192090659/AnsiballZ_copy.py'
Feb 02 09:47:57 compute-1 sudo[120512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:57 compute-1 python3.9[120514]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025676.973876-841-158348192090659/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:58 compute-1 sudo[120512]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:58 compute-1 sudo[120664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbamuocacxytgtxthpwknswcpoigykbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025678.2127445-889-221222198853526/AnsiballZ_file.py'
Feb 02 09:47:58 compute-1 sudo[120664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:58 compute-1 sshd-session[120458]: Invalid user user from 123.58.212.100 port 52908
Feb 02 09:47:58 compute-1 python3.9[120666]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:47:58 compute-1 sudo[120664]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:58 compute-1 sshd-session[120458]: Connection closed by invalid user user 123.58.212.100 port 52908 [preauth]
Feb 02 09:47:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:58 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:59 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:59 compute-1 sudo[120819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jopfozohpibjhujxxqvjytbiyozjmzqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025678.8767219-913-97381034617978/AnsiballZ_stat.py'
Feb 02 09:47:59 compute-1 sudo[120819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:47:59 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bac003030 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:47:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:47:59.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:59 compute-1 python3.9[120821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:47:59 compute-1 ceph-mon[80115]: pgmap v241: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:47:59 compute-1 sudo[120819]: pam_unix(sudo:session): session closed for user root
Feb 02 09:47:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:47:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:47:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:47:59.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:47:59 compute-1 sudo[120942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tynegvvyiwdrzswmdonbmcbgmknkqdhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025678.8767219-913-97381034617978/AnsiballZ_copy.py'
Feb 02 09:47:59 compute-1 sudo[120942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:47:59 compute-1 python3.9[120944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025678.8767219-913-97381034617978/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:47:59 compute-1 sudo[120942]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:00 compute-1 sshd-session[120788]: Invalid user user from 123.58.212.100 port 52920
Feb 02 09:48:00 compute-1 sshd-session[120788]: Connection closed by invalid user user 123.58.212.100 port 52920 [preauth]
Feb 02 09:48:00 compute-1 sudo[121094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crnjldesltrrnjmbgavrgzdugdmjtyrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025680.130342-961-264684981200184/AnsiballZ_file.py'
Feb 02 09:48:00 compute-1 sudo[121094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:00 compute-1 python3.9[121096]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:48:00 compute-1 sudo[121094]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:48:00 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:01 compute-1 sudo[121249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmolapnagzfubqnzptspphwfholqzeke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025680.8098073-985-178843502379072/AnsiballZ_stat.py'
Feb 02 09:48:01 compute-1 sudo[121249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:48:01 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb4003a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:01 compute-1 python3.9[121251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[107236]: 02/02/2026 09:48:01 : epoch 6980726a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0bb8003fa0 fd 38 proxy ignored for local
Feb 02 09:48:01 compute-1 kernel: ganesha.nfsd[115675]: segfault at 50 ip 00007f0c5e63c32e sp 00007f0bc27fb210 error 4 in libntirpc.so.5.8[7f0c5e621000+2c000] likely on CPU 2 (core 0, socket 2)
Feb 02 09:48:01 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:48:01 compute-1 sudo[121249]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:01 compute-1 systemd[1]: Started Process Core Dump (PID 121252/UID 0).
Feb 02 09:48:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:01.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:01 compute-1 ceph-mon[80115]: pgmap v242: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:48:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:01.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:01 compute-1 sudo[121374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvoonovyuvtdvyqyjbkhtxsmuwohpykr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025680.8098073-985-178843502379072/AnsiballZ_copy.py'
Feb 02 09:48:01 compute-1 sudo[121374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:01 compute-1 sshd-session[121097]: Invalid user user from 123.58.212.100 port 52930
Feb 02 09:48:01 compute-1 python3.9[121376]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025680.8098073-985-178843502379072/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:01 compute-1 sudo[121374]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:01 compute-1 sshd-session[121097]: Connection closed by invalid user user 123.58.212.100 port 52930 [preauth]
Feb 02 09:48:02 compute-1 systemd-coredump[121253]: Process 107242 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 59:
                                                    #0  0x00007f0c5e63c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 09:48:02 compute-1 systemd[1]: systemd-coredump@4-121252-0.service: Deactivated successfully.
Feb 02 09:48:02 compute-1 podman[121462]: 2026-02-02 09:48:02.201834604 +0000 UTC m=+0.026885060 container died 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 09:48:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-02c4e8818d725dcc6f5dc4c5bdc381f7c01eae4ee14542779dfb9b0e85e8592f-merged.mount: Deactivated successfully.
Feb 02 09:48:02 compute-1 podman[121462]: 2026-02-02 09:48:02.239584884 +0000 UTC m=+0.064635320 container remove 3eeaf63e40bf8f55b6836e53733dc5f05aee9cfaebe087d8c86b57c3e08bcca1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:48:02 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:48:02 compute-1 sudo[121559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mowrpjdtiwmldukqwrjzsopxruenqjgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025682.0762987-1033-227613480841170/AnsiballZ_file.py'
Feb 02 09:48:02 compute-1 sudo[121559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:48:02 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:48:02 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.142s CPU time.
Feb 02 09:48:02 compute-1 python3.9[121562]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:48:02 compute-1 sudo[121559]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:03 compute-1 sudo[121726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewluhyxjitbvvpywshlommnzvvfoafse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025682.6906075-1057-20545773214041/AnsiballZ_stat.py'
Feb 02 09:48:03 compute-1 sudo[121726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:03 compute-1 sshd-session[121499]: Invalid user user from 123.58.212.100 port 49390
Feb 02 09:48:03 compute-1 python3.9[121728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:03 compute-1 sudo[121726]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:03.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:03 compute-1 sshd-session[121499]: Connection closed by invalid user user 123.58.212.100 port 49390 [preauth]
Feb 02 09:48:03 compute-1 ceph-mon[80115]: pgmap v243: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:48:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:03.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:03 compute-1 sudo[121849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujqcdjdfmoejjngjqqlqpmymbipisqlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025682.6906075-1057-20545773214041/AnsiballZ_copy.py'
Feb 02 09:48:03 compute-1 sudo[121849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:03 compute-1 python3.9[121853]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025682.6906075-1057-20545773214041/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01ba6f1c4701862bb94c27ffc13223400c80de38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:03 compute-1 sudo[121849]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:04 compute-1 ceph-mon[80115]: pgmap v244: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:48:04 compute-1 sshd-session[121850]: Invalid user user from 123.58.212.100 port 49396
Feb 02 09:48:04 compute-1 sshd-session[121850]: Connection closed by invalid user user 123.58.212.100 port 49396 [preauth]
Feb 02 09:48:05 compute-1 sshd-session[115619]: Connection closed by 192.168.122.30 port 39040
Feb 02 09:48:05 compute-1 sshd-session[115616]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:48:05 compute-1 systemd-logind[805]: Session 47 logged out. Waiting for processes to exit.
Feb 02 09:48:05 compute-1 systemd[1]: session-47.scope: Deactivated successfully.
Feb 02 09:48:05 compute-1 systemd[1]: session-47.scope: Consumed 21.079s CPU time.
Feb 02 09:48:05 compute-1 systemd-logind[805]: Removed session 47.
Feb 02 09:48:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:05.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:05.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:06 compute-1 sshd-session[121879]: Invalid user user from 123.58.212.100 port 49404
Feb 02 09:48:07 compute-1 sshd-session[121879]: Connection closed by invalid user user 123.58.212.100 port 49404 [preauth]
Feb 02 09:48:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094807 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:48:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:07.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:07 compute-1 ceph-mon[80115]: pgmap v245: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:48:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:07.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:08 compute-1 sshd-session[121882]: Invalid user user from 123.58.212.100 port 49414
Feb 02 09:48:08 compute-1 sshd-session[121882]: Connection closed by invalid user user 123.58.212.100 port 49414 [preauth]
Feb 02 09:48:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:09.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:09 compute-1 ceph-mon[80115]: pgmap v246: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:48:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:09.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:09 compute-1 sshd-session[121885]: Invalid user user from 123.58.212.100 port 49416
Feb 02 09:48:10 compute-1 sshd-session[121885]: Connection closed by invalid user user 123.58.212.100 port 49416 [preauth]
Feb 02 09:48:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:10 compute-1 sshd-session[121890]: Accepted publickey for zuul from 192.168.122.30 port 54390 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:48:10 compute-1 systemd-logind[805]: New session 48 of user zuul.
Feb 02 09:48:10 compute-1 systemd[1]: Started Session 48 of User zuul.
Feb 02 09:48:10 compute-1 sshd-session[121890]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:48:11 compute-1 sshd-session[121887]: Invalid user user from 123.58.212.100 port 49422
Feb 02 09:48:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:48:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:11.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:48:11 compute-1 ceph-mon[80115]: pgmap v247: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:48:11 compute-1 sshd-session[121887]: Connection closed by invalid user user 123.58.212.100 port 49422 [preauth]
Feb 02 09:48:11 compute-1 sudo[122043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqvotvcgsywcjrrtatbcykgtxheyqfzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025690.9911823-22-33741793853708/AnsiballZ_file.py'
Feb 02 09:48:11 compute-1 sudo[122043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:11.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:11 compute-1 python3.9[122045]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:11 compute-1 sudo[122043]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:12 compute-1 sudo[122197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcymuzdvzpzvazlyxkzrspaqqsvqaarn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025691.9857852-58-248018369872718/AnsiballZ_stat.py'
Feb 02 09:48:12 compute-1 sudo[122197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:12 compute-1 ceph-mon[80115]: pgmap v248: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:48:12 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 5.
Feb 02 09:48:12 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:48:12 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.142s CPU time.
Feb 02 09:48:12 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:48:12 compute-1 python3.9[122199]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:12 compute-1 sudo[122197]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:12 compute-1 sshd-session[122046]: Invalid user user from 123.58.212.100 port 33768
Feb 02 09:48:12 compute-1 podman[122294]: 2026-02-02 09:48:12.791923113 +0000 UTC m=+0.055012941 container create 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:48:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:48:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:48:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:48:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:48:12 compute-1 podman[122294]: 2026-02-02 09:48:12.858533554 +0000 UTC m=+0.121623382 container init 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 09:48:12 compute-1 podman[122294]: 2026-02-02 09:48:12.767210733 +0000 UTC m=+0.030300601 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:48:12 compute-1 podman[122294]: 2026-02-02 09:48:12.873273168 +0000 UTC m=+0.136362996 container start 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Feb 02 09:48:12 compute-1 bash[122294]: 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5
Feb 02 09:48:12 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:48:12 compute-1 sshd-session[122046]: Connection closed by invalid user user 123.58.212.100 port 33768 [preauth]
Feb 02 09:48:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:48:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:48:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:48:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:48:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:48:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:48:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:48:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:48:13 compute-1 sudo[122425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arljoqerspepfyjywddajejkfcucqcsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025691.9857852-58-248018369872718/AnsiballZ_copy.py'
Feb 02 09:48:13 compute-1 sudo[122425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:13 compute-1 python3.9[122429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025691.9857852-58-248018369872718/.source.conf _original_basename=ceph.conf follow=False checksum=d5af35537b3c8ec6eada2ba8657e5bbbf335fb7a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:13.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:13 compute-1 sudo[122425]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:13.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:13 compute-1 sudo[122579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iafilxmqplsemzsevvfqcjkfqpjqvbhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025693.507988-58-91701217454511/AnsiballZ_stat.py'
Feb 02 09:48:13 compute-1 sudo[122579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:13 compute-1 python3.9[122581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:14 compute-1 sudo[122579]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:14 compute-1 sudo[122582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:48:14 compute-1 sudo[122582]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:48:14 compute-1 sudo[122582]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:14 compute-1 sshd-session[122426]: Invalid user user from 123.58.212.100 port 33782
Feb 02 09:48:14 compute-1 sudo[122727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekjafrpnhjgewxbpcdmdlzfoygqemvyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025693.507988-58-91701217454511/AnsiballZ_copy.py'
Feb 02 09:48:14 compute-1 sudo[122727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:14 compute-1 sshd-session[122426]: Connection closed by invalid user user 123.58.212.100 port 33782 [preauth]
Feb 02 09:48:14 compute-1 python3.9[122729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025693.507988-58-91701217454511/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=b59eb4ee1ef760db0b0353d13f50139cad503c44 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:14 compute-1 sudo[122727]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:15 compute-1 sshd-session[121893]: Connection closed by 192.168.122.30 port 54390
Feb 02 09:48:15 compute-1 sshd-session[121890]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:48:15 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Feb 02 09:48:15 compute-1 systemd[1]: session-48.scope: Consumed 2.642s CPU time.
Feb 02 09:48:15 compute-1 systemd-logind[805]: Session 48 logged out. Waiting for processes to exit.
Feb 02 09:48:15 compute-1 systemd-logind[805]: Removed session 48.
Feb 02 09:48:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:15.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:15 compute-1 ceph-mon[80115]: pgmap v249: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:48:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:15.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:15 compute-1 sshd-session[122730]: Invalid user user from 123.58.212.100 port 33788
Feb 02 09:48:15 compute-1 sshd-session[122730]: Connection closed by invalid user user 123.58.212.100 port 33788 [preauth]
Feb 02 09:48:17 compute-1 sshd-session[122757]: Invalid user user from 123.58.212.100 port 33798
Feb 02 09:48:17 compute-1 sshd-session[122757]: Connection closed by invalid user user 123.58.212.100 port 33798 [preauth]
Feb 02 09:48:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:17.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:17 compute-1 ceph-mon[80115]: pgmap v250: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:48:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:48:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:48:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:17.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:48:18 compute-1 sshd-session[122760]: Invalid user user from 123.58.212.100 port 33812
Feb 02 09:48:18 compute-1 sshd-session[122760]: Connection closed by invalid user user 123.58.212.100 port 33812 [preauth]
Feb 02 09:48:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:18 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:48:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:18 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:48:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:19.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:19 compute-1 ceph-mon[80115]: pgmap v251: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:48:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:19.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:19 compute-1 sshd-session[122763]: Invalid user user from 123.58.212.100 port 33826
Feb 02 09:48:20 compute-1 sshd-session[122763]: Connection closed by invalid user user 123.58.212.100 port 33826 [preauth]
Feb 02 09:48:20 compute-1 sshd-session[122765]: Accepted publickey for zuul from 192.168.122.30 port 57730 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:48:20 compute-1 systemd-logind[805]: New session 49 of user zuul.
Feb 02 09:48:20 compute-1 systemd[1]: Started Session 49 of User zuul.
Feb 02 09:48:20 compute-1 sshd-session[122765]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:48:20 compute-1 ceph-mon[80115]: pgmap v252: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Feb 02 09:48:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:21 compute-1 python3.9[122921]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:48:21 compute-1 sshd-session[122821]: Invalid user user from 123.58.212.100 port 33832
Feb 02 09:48:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:21.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:21 compute-1 sshd-session[122821]: Connection closed by invalid user user 123.58.212.100 port 33832 [preauth]
Feb 02 09:48:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:21.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:22 compute-1 sudo[123077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzcrgarghnrjessyvsskkjjklsdiafsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025701.7137465-58-184067797142119/AnsiballZ_file.py'
Feb 02 09:48:22 compute-1 sudo[123077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:22 compute-1 python3.9[123079]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:48:22 compute-1 sudo[123077]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:22 compute-1 sshd-session[122979]: Invalid user user from 123.58.212.100 port 51134
Feb 02 09:48:22 compute-1 sudo[123230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxsqzeyyrwmvtechmimdabroampmgosr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025702.5166626-58-261023849090580/AnsiballZ_file.py'
Feb 02 09:48:22 compute-1 sudo[123230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:22 compute-1 sshd-session[122979]: Connection closed by invalid user user 123.58.212.100 port 51134 [preauth]
Feb 02 09:48:23 compute-1 python3.9[123232]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:48:23 compute-1 sudo[123230]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:23.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:23 compute-1 ceph-mon[80115]: pgmap v253: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:48:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:23.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:23 compute-1 python3.9[123384]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:48:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094824 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:48:24 compute-1 sshd-session[123257]: Invalid user user from 123.58.212.100 port 51144
Feb 02 09:48:24 compute-1 sudo[123534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnfbirgnpjztbwrhrdxczekioxpwrqyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025704.1327333-127-138956296091377/AnsiballZ_seboolean.py'
Feb 02 09:48:24 compute-1 sudo[123534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:24 compute-1 sshd-session[123257]: Connection closed by invalid user user 123.58.212.100 port 51144 [preauth]
Feb 02 09:48:24 compute-1 python3.9[123536]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:25.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:25 compute-1 ceph-mon[80115]: pgmap v254: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:48:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:25.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:25 compute-1 sudo[123560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:48:25 compute-1 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 02 09:48:25 compute-1 sudo[123560]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:48:25 compute-1 sudo[123560]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:25 compute-1 sudo[123585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:48:25 compute-1 sudo[123585]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:48:25 compute-1 sudo[123534]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:25 compute-1 sshd-session[123538]: Invalid user user from 123.58.212.100 port 51150
Feb 02 09:48:26 compute-1 sshd-session[123538]: Connection closed by invalid user user 123.58.212.100 port 51150 [preauth]
Feb 02 09:48:26 compute-1 sudo[123585]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:26 compute-1 sudo[123793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psdnknlnjpzlzshuvabdsjnbqfvwsxcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025706.4619625-157-190051453606586/AnsiballZ_setup.py'
Feb 02 09:48:26 compute-1 sudo[123793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:27 compute-1 python3.9[123795]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:48:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:48:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:48:27 compute-1 ceph-mon[80115]: pgmap v255: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 852 B/s wr, 2 op/s
Feb 02 09:48:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:48:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:48:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:48:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:48:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:48:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:48:27 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:48:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094827 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:48:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:27 compute-1 sudo[123793]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:27 compute-1 sshd-session[123665]: Invalid user user from 123.58.212.100 port 51164
Feb 02 09:48:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:48:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:27.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:48:27 compute-1 sshd-session[123665]: Connection closed by invalid user user 123.58.212.100 port 51164 [preauth]
Feb 02 09:48:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:27.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:27 compute-1 sudo[123877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tslezozfghkqroygbrxrubcilrbdqfdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025706.4619625-157-190051453606586/AnsiballZ_dnf.py'
Feb 02 09:48:27 compute-1 sudo[123877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:27 compute-1 python3.9[123879]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:48:28 compute-1 ceph-mon[80115]: pgmap v256: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 960 B/s wr, 2 op/s
Feb 02 09:48:28 compute-1 sshd-session[123880]: Invalid user user from 123.58.212.100 port 51168
Feb 02 09:48:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:28 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:29 compute-1 sudo[123877]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:29 compute-1 sshd-session[123880]: Connection closed by invalid user user 123.58.212.100 port 51168 [preauth]
Feb 02 09:48:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:29.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:29.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:30 compute-1 sudo[124035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpjlawzyzerbiulsvewgomquznfcjbyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025709.506918-193-127178589463142/AnsiballZ_systemd.py'
Feb 02 09:48:30 compute-1 sudo[124035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:30 compute-1 ceph-mon[80115]: pgmap v257: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 960 B/s rd, 384 B/s wr, 1 op/s
Feb 02 09:48:30 compute-1 sshd-session[123908]: Invalid user user from 123.58.212.100 port 51176
Feb 02 09:48:30 compute-1 python3.9[124037]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 02 09:48:30 compute-1 sudo[124035]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:30 compute-1 sshd-session[123908]: Connection closed by invalid user user 123.58.212.100 port 51176 [preauth]
Feb 02 09:48:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:31 compute-1 sudo[124193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oegjxizgcyfcfgzfqvzbxewktxnipenp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1770025710.7729995-217-246238904108487/AnsiballZ_edpm_nftables_snippet.py'
Feb 02 09:48:31 compute-1 sudo[124193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:31.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:31 compute-1 python3[124195]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 02 09:48:31 compute-1 sudo[124193]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:31.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:31 compute-1 sshd-session[124118]: Invalid user user from 123.58.212.100 port 51188
Feb 02 09:48:32 compute-1 sshd-session[124118]: Connection closed by invalid user user 123.58.212.100 port 51188 [preauth]
Feb 02 09:48:32 compute-1 sudo[124345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-motthoybmhnwamsedrsaukdgmoosfatu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025711.7913682-244-153124373436416/AnsiballZ_file.py'
Feb 02 09:48:32 compute-1 sudo[124345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:32 compute-1 sudo[124348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:48:32 compute-1 sudo[124348]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:48:32 compute-1 sudo[124348]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:32 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:48:32 compute-1 python3.9[124347]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:32 compute-1 ceph-mon[80115]: pgmap v258: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 960 B/s rd, 384 B/s wr, 1 op/s
Feb 02 09:48:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:48:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:48:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:48:32 compute-1 sudo[124345]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:32 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:33 compute-1 sudo[124525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxopzjafgwwkoldsdamkihyqwwfycnoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025712.524172-268-61988204592891/AnsiballZ_stat.py'
Feb 02 09:48:33 compute-1 sudo[124525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:33 compute-1 python3.9[124527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:33 compute-1 sudo[124525]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:33.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:33 compute-1 sshd-session[124373]: Invalid user user from 123.58.212.100 port 33634
Feb 02 09:48:33 compute-1 sudo[124603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmlgbfmoaoubdxbstngpoxjifxquhcsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025712.524172-268-61988204592891/AnsiballZ_file.py'
Feb 02 09:48:33 compute-1 sudo[124603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:33.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:33 compute-1 sshd-session[124373]: Connection closed by invalid user user 123.58.212.100 port 33634 [preauth]
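The sshd-session lines recurring every second or two here are a single remote host, 123.58.212.100, cycling through guessed usernames ("user", later "ubuntu") and dropping each connection before authentication ([preauth]). A small sketch for tallying such attempts from the journal, assuming journalctl is available and sshd logs under the sshd-session identifier seen above:

    import collections
    import re
    import subprocess

    # Pull only sshd-session messages, bare text, no pager.
    out = subprocess.run(
        ["journalctl", "-t", "sshd-session", "--no-pager", "-o", "cat"],
        capture_output=True, text=True, check=True,
    ).stdout

    pattern = re.compile(r"Invalid user (\S+) from (\S+) port \d+")
    counts = collections.Counter(
        (m.group(2), m.group(1)) for m in map(pattern.search, out.splitlines()) if m
    )
    for (ip, user), n in counts.most_common(10):
        print(f"{n:5d}  {ip}  {user}")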
Feb 02 09:48:33 compute-1 python3.9[124605]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:33 compute-1 sudo[124603]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:34 compute-1 sudo[124706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:48:34 compute-1 sudo[124706]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:48:34 compute-1 sudo[124706]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:34 compute-1 sudo[124780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hukgpupgeexmdgmyesgnxkardpwualtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025713.8819-304-130795498529386/AnsiballZ_stat.py'
Feb 02 09:48:34 compute-1 sudo[124780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:34 compute-1 ceph-mon[80115]: pgmap v259: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 960 B/s rd, 384 B/s wr, 1 op/s
Feb 02 09:48:34 compute-1 python3.9[124782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:34 compute-1 sudo[124780]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:34 compute-1 sudo[124858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtwtezuilcvflwzjkprqzfrownuzmejg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025713.8819-304-130795498529386/AnsiballZ_file.py'
Feb 02 09:48:34 compute-1 sudo[124858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:34 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:34 compute-1 python3.9[124860]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.gxzff1_3 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:34 compute-1 sudo[124858]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:48:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:48:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:35.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:35 compute-1 sudo[125013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-powtsiwjeppkbbdwunjisxhdexvhbwme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025715.119811-340-228610888088792/AnsiballZ_stat.py'
Feb 02 09:48:35 compute-1 sudo[125013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:35 compute-1 python3.9[125015]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:35 compute-1 sudo[125013]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:35.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:35 compute-1 sudo[125091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uquyioqmmenzfuurezertksmmtcnfgmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025715.119811-340-228610888088792/AnsiballZ_file.py'
Feb 02 09:48:35 compute-1 sudo[125091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:35 compute-1 sshd-session[124862]: Invalid user user from 123.58.212.100 port 33650
Feb 02 09:48:36 compute-1 python3.9[125093]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:36 compute-1 sudo[125091]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:36 compute-1 sshd-session[124862]: Connection closed by invalid user user 123.58.212.100 port 33650 [preauth]
Feb 02 09:48:36 compute-1 ceph-mon[80115]: pgmap v260: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 672 B/s wr, 2 op/s
Feb 02 09:48:36 compute-1 sudo[125246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rorvswabwwmiihwctnjbddtbhbqmhgez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025716.3518252-379-120944667052241/AnsiballZ_command.py'
Feb 02 09:48:36 compute-1 sudo[125246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:37 compute-1 python3.9[125248]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:48:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:37 compute-1 sudo[125246]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:37 compute-1 sshd-session[125170]: Invalid user user from 123.58.212.100 port 33656
Feb 02 09:48:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:37.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:37 compute-1 sshd-session[125170]: Connection closed by invalid user user 123.58.212.100 port 33656 [preauth]
Feb 02 09:48:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:37.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:37 compute-1 sudo[125401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukjrlkprhscnwtteqmtwukybtjcuryur ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1770025717.3429976-403-238777049262232/AnsiballZ_edpm_nftables_from_files.py'
Feb 02 09:48:37 compute-1 sudo[125401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:38 compute-1 python3[125403]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 02 09:48:38 compute-1 sudo[125401]: pam_unix(sudo:session): session closed for user root
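The custom edpm_nftables_from_files module invoked at 09:48:38 reads the fragments the earlier tasks placed under /var/lib/edpm-config/firewall (edpm-nftables-base.yaml and edpm-nftables-user-rules.yaml) and merges them into one rule list for the nft templates rendered next. The fragment schema is not visible in this log; a rough sketch under the assumption that each file is a plain YAML list:

    import glob
    import os
    import yaml  # PyYAML

    def load_rules(src: str = "/var/lib/edpm-config/firewall") -> list:
        # Concatenate every YAML fragment found under src, in filename order.
        rules = []
        for path in sorted(glob.glob(os.path.join(src, "*.yaml"))):
            with open(path) as fh:
                rules.extend(yaml.safe_load(fh) or [])  # assumes each file holds a list of rule dicts
        return rules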
Feb 02 09:48:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:38 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:48:38 compute-1 ceph-mon[80115]: pgmap v261: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 672 B/s wr, 2 op/s
Feb 02 09:48:38 compute-1 sudo[125553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-infulpvtyednnnyjphbeosuwwsjtmbnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025718.3788354-427-93548668268750/AnsiballZ_stat.py'
Feb 02 09:48:38 compute-1 sudo[125553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:38 compute-1 sshd-session[125384]: Invalid user user from 123.58.212.100 port 33666
Feb 02 09:48:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:38 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:38 compute-1 python3.9[125555]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:38 compute-1 sudo[125553]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:38 compute-1 sshd-session[125384]: Connection closed by invalid user user 123.58.212.100 port 33666 [preauth]
Feb 02 09:48:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:39.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:39 compute-1 sudo[125681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbifscjiyqdpmfrseginfjxmxrtdhozo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025718.3788354-427-93548668268750/AnsiballZ_copy.py'
Feb 02 09:48:39 compute-1 sudo[125681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:39.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:39 compute-1 python3.9[125683]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025718.3788354-427-93548668268750/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:39 compute-1 sudo[125681]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:40 compute-1 sshd-session[125606]: Invalid user user from 123.58.212.100 port 33670
Feb 02 09:48:40 compute-1 sudo[125833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmxxkbujtronxjfysofsyqsfjnixrydr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025719.8363638-472-232289786823075/AnsiballZ_stat.py'
Feb 02 09:48:40 compute-1 sudo[125833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:40 compute-1 sshd-session[125606]: Connection closed by invalid user user 123.58.212.100 port 33670 [preauth]
Feb 02 09:48:40 compute-1 ceph-mon[80115]: pgmap v262: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:48:40 compute-1 python3.9[125835]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:40 compute-1 sudo[125833]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:40 compute-1 sudo[125960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcgfkncqnyeuffhgwoguspxvydwpancq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025719.8363638-472-232289786823075/AnsiballZ_copy.py'
Feb 02 09:48:40 compute-1 sudo[125960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:40 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:40 compute-1 python3.9[125963]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025719.8363638-472-232289786823075/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:40 compute-1 sudo[125960]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:41 compute-1 sshd-session[125890]: Invalid user user from 123.58.212.100 port 33676
Feb 02 09:48:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:41.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:41 compute-1 sudo[126113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsdbradgqyrpwgejmvuangyprghoaedf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025721.2520907-517-269301735388539/AnsiballZ_stat.py'
Feb 02 09:48:41 compute-1 sudo[126113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:41.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:41 compute-1 sshd-session[125890]: Connection closed by invalid user user 123.58.212.100 port 33676 [preauth]
Feb 02 09:48:41 compute-1 python3.9[126115]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:41 compute-1 sudo[126113]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:42 compute-1 sudo[126240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kinamktjvxihchxyddcfaqimpigmhsco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025721.2520907-517-269301735388539/AnsiballZ_copy.py'
Feb 02 09:48:42 compute-1 sudo[126240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:42 compute-1 python3.9[126242]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025721.2520907-517-269301735388539/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:42 compute-1 sudo[126240]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:42 compute-1 ceph-mon[80115]: pgmap v263: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:48:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:42 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:42 compute-1 sudo[126393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auewelavsrrxshjyvlenqbpdanncktzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025722.599423-562-5699715306639/AnsiballZ_stat.py'
Feb 02 09:48:42 compute-1 sudo[126393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:43 compute-1 python3.9[126395]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:43 compute-1 sshd-session[126141]: Invalid user user from 123.58.212.100 port 60708
Feb 02 09:48:43 compute-1 sudo[126393]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:43 compute-1 sshd-session[126141]: Connection closed by invalid user user 123.58.212.100 port 60708 [preauth]
Feb 02 09:48:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:43.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:43 compute-1 sudo[126518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xidhkizxctzbyfbqskaarjwljkmhrygv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025722.599423-562-5699715306639/AnsiballZ_copy.py'
Feb 02 09:48:43 compute-1 sudo[126518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:43.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:43 compute-1 python3.9[126520]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025722.599423-562-5699715306639/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:43 compute-1 sudo[126518]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094844 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:48:44 compute-1 sudo[126672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzqczmtcokddmjykzcguwynzqnaxugpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025723.9314237-607-263071308645558/AnsiballZ_stat.py'
Feb 02 09:48:44 compute-1 sudo[126672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:44 compute-1 ceph-mon[80115]: pgmap v264: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:48:44 compute-1 python3.9[126674]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:44 compute-1 sudo[126672]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:44 compute-1 sshd-session[126521]: Invalid user user from 123.58.212.100 port 60724
Feb 02 09:48:44 compute-1 sshd-session[126521]: Connection closed by invalid user user 123.58.212.100 port 60724 [preauth]
Feb 02 09:48:44 compute-1 sudo[126798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdoikjsonubggyenfqfrmcmbhqrugvzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025723.9314237-607-263071308645558/AnsiballZ_copy.py'
Feb 02 09:48:44 compute-1 sudo[126798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:44 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:45 compute-1 python3.9[126800]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770025723.9314237-607-263071308645558/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:45 compute-1 sudo[126798]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:48:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:45.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:48:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:45.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:45 compute-1 sudo[126952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haznpgobxrpojmfxpfnykoquocokliza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025725.3216648-652-232809235584157/AnsiballZ_file.py'
Feb 02 09:48:45 compute-1 sudo[126952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:45 compute-1 python3.9[126954]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:45 compute-1 sudo[126952]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:46 compute-1 sshd-session[126848]: Invalid user ubuntu from 123.58.212.100 port 60736
Feb 02 09:48:46 compute-1 ceph-mon[80115]: pgmap v265: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:48:46 compute-1 sudo[127104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvubdihgmisisjrrjitbzlvcxqbrxoqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025726.0947902-676-60783625158929/AnsiballZ_command.py'
Feb 02 09:48:46 compute-1 sudo[127104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:46 compute-1 sshd-session[126848]: Connection closed by invalid user ubuntu 123.58.212.100 port 60736 [preauth]
Feb 02 09:48:46 compute-1 python3.9[127106]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:48:46 compute-1 sudo[127104]: pam_unix(sudo:session): session closed for user root
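The command at 09:48:46 is the dry-run gate for the generated ruleset: the five fragments are concatenated in load order and piped to nft -c -f -, which parses and validates them without touching the live tables, so a template error fails the play before anything is applied. The same check driven from Python, as a sketch:

    import subprocess
    from pathlib import Path

    FRAGMENTS = [
        "/etc/nftables/edpm-chains.nft",
        "/etc/nftables/edpm-flushes.nft",
        "/etc/nftables/edpm-rules.nft",
        "/etc/nftables/edpm-update-jumps.nft",
        "/etc/nftables/edpm-jumps.nft",
    ]

    # "nft -c -f -" reads the ruleset from stdin in check-only mode;
    # a nonzero exit (CalledProcessError here) means a syntax or semantic error.
    ruleset = "".join(Path(p).read_text() for p in FRAGMENTS)
    subprocess.run(["nft", "-c", "-f", "-"], input=ruleset, text=True, check=True)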
Feb 02 09:48:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:46 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:47 compute-1 sudo[127262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eilcptvzzoqdrnqnimiuukpgbwumwkfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025726.9171383-700-68394407225536/AnsiballZ_blockinfile.py'
Feb 02 09:48:47 compute-1 sudo[127262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:48:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:47.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:47 compute-1 python3.9[127264]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:47 compute-1 sudo[127262]: pam_unix(sudo:session): session closed for user root
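The blockinfile task just above pins those four include lines into /etc/sysconfig/nftables.conf between "# BEGIN ANSIBLE MANAGED BLOCK" and "# END ANSIBLE MANAGED BLOCK" markers, validating the candidate file with nft -c -f %s before swapping it in, so the EDPM fragments load again on every nftables.service start. A minimal sketch of that marker-managed edit (create=False above, so the target file is assumed to exist):

    import re

    MARKER_BEGIN = "# BEGIN ANSIBLE MANAGED BLOCK"
    MARKER_END = "# END ANSIBLE MANAGED BLOCK"
    BLOCK = "\n".join([
        'include "/etc/nftables/iptables.nft"',
        'include "/etc/nftables/edpm-chains.nft"',
        'include "/etc/nftables/edpm-rules.nft"',
        'include "/etc/nftables/edpm-jumps.nft"',
    ])

    def set_managed_block(text: str) -> str:
        # Replace an existing managed block, or append one if the markers are absent.
        managed = f"{MARKER_BEGIN}\n{BLOCK}\n{MARKER_END}\n"
        pattern = re.compile(re.escape(MARKER_BEGIN) + r".*?" + re.escape(MARKER_END) + r"\n?", re.S)
        if pattern.search(text):
            return pattern.sub(managed, text)
        return text + managed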
Feb 02 09:48:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:47 compute-1 sshd-session[127113]: Invalid user ubuntu from 123.58.212.100 port 60748
Feb 02 09:48:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:47.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:47 compute-1 sshd-session[127113]: Connection closed by invalid user ubuntu 123.58.212.100 port 60748 [preauth]
Feb 02 09:48:48 compute-1 sudo[127416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixdyssdpzglnsokxmwjzxmlnnqcmhegu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025727.8755107-727-129779338715740/AnsiballZ_command.py'
Feb 02 09:48:48 compute-1 sudo[127416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:48 compute-1 python3.9[127418]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:48:48 compute-1 sudo[127416]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:48 compute-1 ceph-mon[80115]: pgmap v266: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:48:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:48 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:48 compute-1 sudo[127570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmjnpqpgfyfaukgzyrcvqaphwceglxqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025728.6600292-751-101277725931268/AnsiballZ_stat.py'
Feb 02 09:48:48 compute-1 sudo[127570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:48 compute-1 sshd-session[127364]: Invalid user ubuntu from 123.58.212.100 port 60756
Feb 02 09:48:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:49 compute-1 python3.9[127572]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:48:49 compute-1 sudo[127570]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:49 compute-1 sshd-session[127364]: Connection closed by invalid user ubuntu 123.58.212.100 port 60756 [preauth]
Feb 02 09:48:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:49.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:49.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:49 compute-1 sudo[127726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvszvgsrjlmronyqnkvmnfgmtbsifcro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025729.4114614-775-199192125787769/AnsiballZ_command.py'
Feb 02 09:48:49 compute-1 sudo[127726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:49 compute-1 python3.9[127728]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:48:49 compute-1 sudo[127726]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:50 compute-1 ceph-mon[80115]: pgmap v267: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:48:50 compute-1 sudo[127881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btzvjxogyoqibzffjkecavymlpjigvjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025730.1356332-799-98991234391068/AnsiballZ_file.py'
Feb 02 09:48:50 compute-1 sudo[127881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:50 compute-1 sshd-session[127625]: Invalid user ubuntu from 123.58.212.100 port 60762
Feb 02 09:48:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:50 compute-1 python3.9[127883]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:50 compute-1 sudo[127881]: pam_unix(sudo:session): session closed for user root
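The apply sequence here is ordered to be safe to re-run: edpm-chains.nft is loaded unconditionally at 09:48:48 (re-creating chains is harmless), and only because the edpm-rules.nft.changed sentinel touched at 09:48:45 still exists are the flush, rules, and update-jumps fragments streamed through nft -f - at 09:48:49, after which the sentinel is removed. A sketch of that conditional reload, with the paths taken from the log:

    import os
    import subprocess
    from pathlib import Path

    SENTINEL = "/etc/nftables/edpm-rules.nft.changed"

    def nft_load(*paths: str) -> None:
        # Concatenate the fragments and load them in one "nft -f -" transaction.
        ruleset = "".join(Path(p).read_text() for p in paths)
        subprocess.run(["nft", "-f", "-"], input=ruleset, text=True, check=True)

    nft_load("/etc/nftables/edpm-chains.nft")      # chains: idempotent, applied every run
    if os.path.exists(SENTINEL):                   # rules were (re)generated this run
        nft_load("/etc/nftables/edpm-flushes.nft",
                 "/etc/nftables/edpm-rules.nft",
                 "/etc/nftables/edpm-update-jumps.nft")
        os.remove(SENTINEL)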
Feb 02 09:48:50 compute-1 sshd-session[127625]: Connection closed by invalid user ubuntu 123.58.212.100 port 60762 [preauth]
Feb 02 09:48:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:50 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:51.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:51 compute-1 ceph-mon[80115]: pgmap v268: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:48:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:51.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:51 compute-1 sshd-session[127909]: Invalid user ubuntu from 123.58.212.100 port 60774
Feb 02 09:48:51 compute-1 sshd-session[127909]: Connection closed by invalid user ubuntu 123.58.212.100 port 60774 [preauth]
Feb 02 09:48:52 compute-1 python3.9[128036]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:48:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094852 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:48:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:52 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:53 compute-1 sshd-session[128062]: Invalid user ubuntu from 123.58.212.100 port 34138
Feb 02 09:48:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:53 compute-1 sudo[128190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpzvbrfquleihawohsxbziridjmmyjoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025732.9878323-919-275182659207819/AnsiballZ_command.py'
Feb 02 09:48:53 compute-1 sudo[128190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:53 compute-1 sshd-session[128062]: Connection closed by invalid user ubuntu 123.58.212.100 port 34138 [preauth]
Feb 02 09:48:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:53.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:53 compute-1 python3.9[128192]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:9e:41:65:cf" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:48:53 compute-1 ovs-vsctl[128193]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:9e:41:65:cf external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 02 09:48:53 compute-1 sudo[128190]: pam_unix(sudo:session): session closed for user root
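The ovs-vsctl call at 09:48:53 stamps the Open_vSwitch record's external_ids with everything ovn-controller needs to register this chassis against the southbound database at ssl:ovsdbserver-sb.openstack.svc:6642: tunnel endpoint and encapsulation, bridge and chassis-MAC mappings, plus probe-interval and ofctrl-wait tuning. Driving a subset of the same keys from Python, as a sketch with values taken from the log:

    import subprocess

    external_ids = {
        "hostname": "compute-1.ctlplane.example.com",
        "ovn-bridge": "br-int",
        "ovn-bridge-mappings": "datacentre:br-ex",
        "ovn-chassis-mac-mappings": "datacentre:2e:0a:9e:41:65:cf",
        "ovn-encap-ip": "172.19.0.101",
        "ovn-encap-type": "geneve",
        "ovn-remote": "ssl:ovsdbserver-sb.openstack.svc:6642",
        "ovn-monitor-all": "True",
    }

    # "open" abbreviates the Open_vSwitch table, as in the logged command.
    cmd = ["ovs-vsctl", "set", "open", "."]
    cmd += [f"external_ids:{key}={value}" for key, value in external_ids.items()]
    subprocess.run(cmd, check=True)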
Feb 02 09:48:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:53.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:54 compute-1 ceph-mon[80115]: pgmap v269: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:48:54 compute-1 sudo[128345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uomouynmnkcryyaqlcvjediijatblffj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025733.8326752-946-132117339363806/AnsiballZ_command.py'
Feb 02 09:48:54 compute-1 sudo[128345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:54 compute-1 sudo[128348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:48:54 compute-1 sudo[128348]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:48:54 compute-1 sudo[128348]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:54 compute-1 python3.9[128347]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:48:54 compute-1 sudo[128345]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:54 compute-1 sshd-session[128215]: Invalid user ubuntu from 123.58.212.100 port 34144
Feb 02 09:48:54 compute-1 sshd-session[128215]: Connection closed by invalid user ubuntu 123.58.212.100 port 34144 [preauth]
Feb 02 09:48:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:54 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:54 compute-1 sudo[128527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akgidmwpiftgfjymvxvzrsceblemkpch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025734.595094-970-779759542085/AnsiballZ_command.py'
Feb 02 09:48:54 compute-1 sudo[128527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:55 compute-1 python3.9[128530]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:48:55 compute-1 ovs-vsctl[128531]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 02 09:48:55 compute-1 sudo[128527]: pam_unix(sudo:session): session closed for user root
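The pair of tasks at 09:48:54-55 first checks ovs-vsctl show for an existing Manager row and, finding none, creates one targeting ptcp:6640:127.0.0.1 and links it into Open_vSwitch.manager_options in a single transaction, giving local agents a plain-TCP path to ovsdb-server on the loopback. A sketch of the same guard-then-create logic:

    import subprocess

    # Mirror the 'ovs-vsctl show | grep -q "Manager"' guard from the log.
    show = subprocess.run(["ovs-vsctl", "show"], capture_output=True, text=True, check=True)
    if "Manager" not in show.stdout:
        subprocess.run(
            ["ovs-vsctl", "--timeout=5", "--id=@manager",
             "--", "create", "Manager", 'target="ptcp:6640:127.0.0.1"',
             "--", "add", "Open_vSwitch", ".", "manager_options", "@manager"],
            check=True,
        )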
Feb 02 09:48:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:48:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:55.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:48:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:48:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:55.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:55 compute-1 sshd-session[128524]: Invalid user ubuntu from 123.58.212.100 port 34154
Feb 02 09:48:55 compute-1 python3.9[128681]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:48:56 compute-1 ceph-mon[80115]: pgmap v270: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:48:56 compute-1 sshd-session[128524]: Connection closed by invalid user ubuntu 123.58.212.100 port 34154 [preauth]
Feb 02 09:48:56 compute-1 sudo[128835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmdiqecmpmigzbvrpjnnvnkgtldadnev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025736.2534628-1021-85932001212768/AnsiballZ_file.py'
Feb 02 09:48:56 compute-1 sudo[128835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:56 compute-1 python3.9[128837]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:48:56 compute-1 sudo[128835]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:56 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:57 compute-1 sshd-session[128760]: Invalid user ubuntu from 123.58.212.100 port 34158
Feb 02 09:48:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:57 compute-1 sudo[128990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbzrzujwlpumbuhksxeklxfxszpzcbwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025737.0546124-1045-233200739405839/AnsiballZ_stat.py'
Feb 02 09:48:57 compute-1 sudo[128990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:57.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:57 compute-1 python3.9[128992]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:57 compute-1 sshd-session[128760]: Connection closed by invalid user ubuntu 123.58.212.100 port 34158 [preauth]
Feb 02 09:48:57 compute-1 sudo[128990]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:57.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:57 compute-1 sudo[129069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpmjmnnmumgiyjogdgwygjmscyolgfbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025737.0546124-1045-233200739405839/AnsiballZ_file.py'
Feb 02 09:48:57 compute-1 sudo[129069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:57 compute-1 python3.9[129072]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:48:57 compute-1 sudo[129069]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:58 compute-1 ceph-mon[80115]: pgmap v271: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:48:58 compute-1 sudo[129222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyjgoiyanlbcansgayishxmgfnwyoqms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025738.107665-1045-224502956003939/AnsiballZ_stat.py'
Feb 02 09:48:58 compute-1 sudo[129222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:58 compute-1 sshd-session[129064]: Invalid user ubuntu from 123.58.212.100 port 34162
Feb 02 09:48:58 compute-1 python3.9[129224]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:48:58 compute-1 sudo[129222]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:58 compute-1 sshd-session[129064]: Connection closed by invalid user ubuntu 123.58.212.100 port 34162 [preauth]
Feb 02 09:48:58 compute-1 sudo[129301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwkwyesbghxlqxvkjyzcyiyrerbemoip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025738.107665-1045-224502956003939/AnsiballZ_file.py'
Feb 02 09:48:58 compute-1 sudo[129301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:58 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:59 compute-1 python3.9[129303]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:48:59 compute-1 sudo[129301]: pam_unix(sudo:session): session closed for user root
Feb 02 09:48:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:48:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:48:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:48:59.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:59 compute-1 sudo[129455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyrzibotfmqlggantecyirheelsoujfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025739.397199-1114-173246510463167/AnsiballZ_file.py'
Feb 02 09:48:59 compute-1 sudo[129455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:48:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:48:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:48:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:48:59.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:48:59 compute-1 python3.9[129457]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:48:59 compute-1 sudo[129455]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:00 compute-1 sshd-session[129304]: Invalid user ubuntu from 123.58.212.100 port 34174
Feb 02 09:49:00 compute-1 ceph-mon[80115]: pgmap v272: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:49:00 compute-1 sshd-session[129304]: Connection closed by invalid user ubuntu 123.58.212.100 port 34174 [preauth]
Feb 02 09:49:00 compute-1 sudo[129607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qohokaogqafpkmevlhvpjfutyyjhqvew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025740.0638554-1138-186645349731738/AnsiballZ_stat.py'
Feb 02 09:49:00 compute-1 sudo[129607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:00 compute-1 python3.9[129609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:00 compute-1 sudo[129607]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:00 compute-1 sudo[129688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evahevnxbodrkrtaqtdhfaxtvowiwnkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025740.0638554-1138-186645349731738/AnsiballZ_file.py'
Feb 02 09:49:00 compute-1 sudo[129688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:01 compute-1 python3.9[129690]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:01 compute-1 sudo[129688]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:49:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:01.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:01 compute-1 sshd-session[129610]: Invalid user ubuntu from 123.58.212.100 port 34176
Feb 02 09:49:01 compute-1 sudo[129840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgwlkvzjuwzinsfepdvrmjhpmuifxbma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025741.2852387-1174-6596430157931/AnsiballZ_stat.py'
Feb 02 09:49:01 compute-1 sudo[129840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:01.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:01 compute-1 sshd-session[129610]: Connection closed by invalid user ubuntu 123.58.212.100 port 34176 [preauth]
Feb 02 09:49:01 compute-1 python3.9[129842]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:01 compute-1 sudo[129840]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:02 compute-1 sudo[129920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsyipwazuoppjuheabudpkawpftywcql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025741.2852387-1174-6596430157931/AnsiballZ_file.py'
Feb 02 09:49:02 compute-1 sudo[129920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:02 compute-1 ceph-mon[80115]: pgmap v273: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:49:02 compute-1 python3.9[129922]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:02 compute-1 sudo[129920]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:02 compute-1 sudo[130073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-napxgpufztyckekskhlovcvqnjvxyjvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025742.5589235-1210-12765120954356/AnsiballZ_systemd.py'
Feb 02 09:49:02 compute-1 sudo[130073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:02 compute-1 sshd-session[129853]: Invalid user ubuntu from 123.58.212.100 port 56024
Feb 02 09:49:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:02 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:49:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:03 compute-1 python3.9[130075]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:49:03 compute-1 systemd[1]: Reloading.
Feb 02 09:49:03 compute-1 sshd-session[129853]: Connection closed by invalid user ubuntu 123.58.212.100 port 56024 [preauth]
Feb 02 09:49:03 compute-1 systemd-rc-local-generator[130095]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:49:03 compute-1 systemd-sysv-generator[130105]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:49:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:03.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:03 compute-1 sudo[130073]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:03.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:03 compute-1 sudo[130265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcjaglihkdwmbaahmruhzgtnyqgjglsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025743.6904182-1234-174442180688041/AnsiballZ_stat.py'
Feb 02 09:49:03 compute-1 sudo[130265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:04 compute-1 python3.9[130267]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:04 compute-1 sudo[130265]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:04 compute-1 ceph-mon[80115]: pgmap v274: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:49:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:04 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:49:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:04 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:49:04 compute-1 sudo[130343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyjhcfhyndmhfsyzeksqeugqnqpymfhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025743.6904182-1234-174442180688041/AnsiballZ_file.py'
Feb 02 09:49:04 compute-1 sudo[130343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:04 compute-1 sshd-session[130114]: Invalid user ubuntu from 123.58.212.100 port 56040
Feb 02 09:49:04 compute-1 python3.9[130345]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:04 compute-1 sudo[130343]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:04 compute-1 sshd-session[130114]: Connection closed by invalid user ubuntu 123.58.212.100 port 56040 [preauth]
Feb 02 09:49:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:04 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:05 compute-1 sudo[130498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfccdogqmjwliuunlcbijpidbcpebcsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025744.8482335-1270-111771706586879/AnsiballZ_stat.py'
Feb 02 09:49:05 compute-1 sudo[130498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:05 compute-1 python3.9[130500]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:05 compute-1 sudo[130498]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:05.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:05 compute-1 sudo[130576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otlscspxngxzysgtlrkatlfscsfmbvdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025744.8482335-1270-111771706586879/AnsiballZ_file.py'
Feb 02 09:49:05 compute-1 sudo[130576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:05.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:05 compute-1 python3.9[130578]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:05 compute-1 sudo[130576]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:05 compute-1 sshd-session[130416]: Invalid user ubuntu from 123.58.212.100 port 56054
Feb 02 09:49:06 compute-1 ceph-mon[80115]: pgmap v275: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 596 B/s wr, 1 op/s
Feb 02 09:49:06 compute-1 sudo[130728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gckshrcbsuqrijbdomgteicaxzsaehht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025745.95547-1306-260938828294222/AnsiballZ_systemd.py'
Feb 02 09:49:06 compute-1 sudo[130728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:06 compute-1 sshd-session[130416]: Connection closed by invalid user ubuntu 123.58.212.100 port 56054 [preauth]
Feb 02 09:49:06 compute-1 python3.9[130730]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:49:06 compute-1 systemd[1]: Reloading.
Feb 02 09:49:06 compute-1 systemd-sysv-generator[130766]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:49:06 compute-1 systemd-rc-local-generator[130761]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:49:06 compute-1 systemd[1]: Starting Create netns directory...
Feb 02 09:49:06 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 02 09:49:06 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 02 09:49:06 compute-1 systemd[1]: Finished Create netns directory.
Feb 02 09:49:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:06 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:06 compute-1 sudo[130728]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:49:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:07 compute-1 sshd-session[130731]: Invalid user ubuntu from 123.58.212.100 port 56056
Feb 02 09:49:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:07.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:07 compute-1 sudo[130924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubecmowwecztdmgbeqobaghkezteruqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025747.3293867-1336-198927187335313/AnsiballZ_file.py'
Feb 02 09:49:07 compute-1 sudo[130924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:07 compute-1 sshd-session[130731]: Connection closed by invalid user ubuntu 123.58.212.100 port 56056 [preauth]
Feb 02 09:49:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:07.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:07 compute-1 python3.9[130926]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:07 compute-1 sudo[130924]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:08 compute-1 ceph-mon[80115]: pgmap v276: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Feb 02 09:49:08 compute-1 sudo[131078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvuobfwomjjisjkzvlbzofafzdmhqewp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025748.0329247-1360-169299244238092/AnsiballZ_stat.py'
Feb 02 09:49:08 compute-1 sudo[131078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:08 compute-1 python3.9[131080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:08 compute-1 sudo[131078]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:08 compute-1 sshd-session[130927]: Invalid user ubuntu from 123.58.212.100 port 56060
Feb 02 09:49:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 09:49:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 8069 writes, 33K keys, 8069 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s
                                           Cumulative WAL: 8069 writes, 1528 syncs, 5.28 writes per sync, written: 0.02 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 8069 writes, 33K keys, 8069 commit groups, 1.0 writes per commit group, ingest: 21.03 MB, 0.04 MB/s
                                           Interval WAL: 8069 writes, 1528 syncs, 5.28 writes per sync, written: 0.02 GB, 0.04 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Feb 02 09:49:08 compute-1 sudo[131202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjkthxkjqlbamjsvslhvfozoiwsnixwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025748.0329247-1360-169299244238092/AnsiballZ_copy.py'
Feb 02 09:49:08 compute-1 sudo[131202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:08 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd80011e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:08 compute-1 sshd-session[130927]: Connection closed by invalid user ubuntu 123.58.212.100 port 56060 [preauth]
Feb 02 09:49:09 compute-1 python3.9[131204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025748.0329247-1360-169299244238092/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:09 compute-1 sudo[131202]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:09.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:09.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:09 compute-1 sudo[131356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nketrdhndsdgffddgtdvexubursavrlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025749.6213245-1411-65818355179279/AnsiballZ_file.py'
Feb 02 09:49:09 compute-1 sudo[131356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:10 compute-1 python3.9[131358]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:10 compute-1 sshd-session[131229]: Invalid user ubuntu from 123.58.212.100 port 56062
Feb 02 09:49:10 compute-1 sudo[131356]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:10 compute-1 ceph-mon[80115]: pgmap v277: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:49:10 compute-1 sshd-session[131229]: Connection closed by invalid user ubuntu 123.58.212.100 port 56062 [preauth]
Feb 02 09:49:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:10 compute-1 sudo[131508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crdqfcqdsyfzkmsecdoeqbgbixpnjpzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025750.3572469-1435-154981441465586/AnsiballZ_file.py'
Feb 02 09:49:10 compute-1 sudo[131508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:10 compute-1 python3.9[131512]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:10 compute-1 sudo[131508]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:10 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd80011e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:11 compute-1 sudo[131663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvdndybhthabjhqdirmbkbaprgmdmcxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025751.0971673-1459-92705423630933/AnsiballZ_stat.py'
Feb 02 09:49:11 compute-1 sudo[131663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:11.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:11 compute-1 python3.9[131665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:11 compute-1 sudo[131663]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:11.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:12 compute-1 sudo[131786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkrnoimnhcbbshymazswzvrdhttelyqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025751.0971673-1459-92705423630933/AnsiballZ_copy.py'
Feb 02 09:49:12 compute-1 sudo[131786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:12 compute-1 python3.9[131788]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025751.0971673-1459-92705423630933/.source.json _original_basename=.zs2mzgar follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:12 compute-1 sudo[131786]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:12 compute-1 ceph-mon[80115]: pgmap v278: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:49:12 compute-1 sshd-session[131509]: Invalid user ubuntu from 123.58.212.100 port 56074
Feb 02 09:49:12 compute-1 sshd-session[131509]: Connection closed by invalid user ubuntu 123.58.212.100 port 56074 [preauth]
Feb 02 09:49:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094912 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:49:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:13.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:13 compute-1 python3.9[131941]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:13.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:13 compute-1 sshd-session[131814]: Invalid user ubuntu from 123.58.212.100 port 47498
Feb 02 09:49:14 compute-1 sshd-session[131814]: Connection closed by invalid user ubuntu 123.58.212.100 port 47498 [preauth]
Feb 02 09:49:14 compute-1 sudo[132041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:49:14 compute-1 sudo[132041]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:49:14 compute-1 sudo[132041]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:14 compute-1 ceph-mon[80115]: pgmap v279: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:49:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:14 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8001380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:15.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:15 compute-1 ceph-mon[80115]: pgmap v280: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:49:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:15.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:15 compute-1 sshd-session[132054]: Invalid user ubuntu from 123.58.212.100 port 47500
Feb 02 09:49:15 compute-1 sudo[132390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdmuppysxucgsyfjetvnjfhgjpvtxqbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025755.5149279-1579-199479133827866/AnsiballZ_container_config_data.py'
Feb 02 09:49:15 compute-1 sudo[132390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:16 compute-1 sshd-session[132054]: Connection closed by invalid user ubuntu 123.58.212.100 port 47500 [preauth]
Feb 02 09:49:16 compute-1 python3.9[132392]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 02 09:49:16 compute-1 sudo[132390]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:16 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:17 compute-1 sudo[132545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzlzsasmlqipvpqcfbpsqnnrzwvhksho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025756.6025155-1612-49398202766767/AnsiballZ_container_config_hash.py'
Feb 02 09:49:17 compute-1 sudo[132545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:17 compute-1 python3.9[132547]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 02 09:49:17 compute-1 sudo[132545]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:17 compute-1 sshd-session[132417]: Invalid user ubuntu from 123.58.212.100 port 47510
Feb 02 09:49:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:17.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:17 compute-1 sshd-session[132417]: Connection closed by invalid user ubuntu 123.58.212.100 port 47510 [preauth]
Feb 02 09:49:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:17.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:18 compute-1 ceph-mon[80115]: pgmap v281: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:49:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:49:18 compute-1 sudo[132699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnhdosbqvmscffmvrnnbzvjvvdtqgzff ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1770025757.6834905-1642-28717440627598/AnsiballZ_edpm_container_manage.py'
Feb 02 09:49:18 compute-1 sudo[132699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:18 compute-1 python3[132701]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 02 09:49:18 compute-1 sshd-session[132624]: Invalid user ubuntu from 123.58.212.100 port 47512
Feb 02 09:49:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:18 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd80095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:18 compute-1 sshd-session[132624]: Connection closed by invalid user ubuntu 123.58.212.100 port 47512 [preauth]
Feb 02 09:49:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:19.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:19.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:20 compute-1 ceph-mon[80115]: pgmap v282: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:49:20 compute-1 sshd-session[132735]: Invalid user ubuntu from 123.58.212.100 port 47516
Feb 02 09:49:20 compute-1 sshd-session[132735]: Connection closed by invalid user ubuntu 123.58.212.100 port 47516 [preauth]
Feb 02 09:49:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd80095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:21.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:21.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094922 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:49:22 compute-1 ceph-mon[80115]: pgmap v283: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:49:22 compute-1 sshd-session[132769]: Invalid user ubuntu from 123.58.212.100 port 47520
Feb 02 09:49:22 compute-1 sshd-session[132769]: Connection closed by invalid user ubuntu 123.58.212.100 port 47520 [preauth]
Feb 02 09:49:22 compute-1 podman[132716]: 2026-02-02 09:49:22.794647422 +0000 UTC m=+4.270110576 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e
Feb 02 09:49:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:22 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:22 compute-1 podman[132841]: 2026-02-02 09:49:22.954558689 +0000 UTC m=+0.075084690 container create 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:49:22 compute-1 podman[132841]: 2026-02-02 09:49:22.909016647 +0000 UTC m=+0.029542648 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e
Feb 02 09:49:22 compute-1 python3[132701]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e
Feb 02 09:49:23 compute-1 sudo[132699]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:23.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:23.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:23 compute-1 sshd-session[132862]: Invalid user ubuntu from 123.58.212.100 port 60280
Feb 02 09:49:23 compute-1 sudo[133031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjovqbjgbifhgoqdartffljxousmhkqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025763.7081242-1666-107277386541720/AnsiballZ_stat.py'
Feb 02 09:49:23 compute-1 sudo[133031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:24 compute-1 sshd-session[132862]: Connection closed by invalid user ubuntu 123.58.212.100 port 60280 [preauth]
Feb 02 09:49:24 compute-1 python3.9[133033]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:49:24 compute-1 sudo[133031]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:24 compute-1 ceph-mon[80115]: pgmap v284: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:49:24 compute-1 sudo[133188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jibxfgeqpvfaybuyepyomsvymlmsbvrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025764.595775-1693-181542765384067/AnsiballZ_file.py'
Feb 02 09:49:24 compute-1 sudo[133188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:24 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:25 compute-1 python3.9[133190]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:25 compute-1 sudo[133188]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:25 compute-1 sshd-session[133060]: Invalid user ubuntu from 123.58.212.100 port 60282
Feb 02 09:49:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:25 compute-1 sudo[133264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grgpjsyhhufrgpwusncxvokmqygrjqdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025764.595775-1693-181542765384067/AnsiballZ_stat.py'
Feb 02 09:49:25 compute-1 sudo[133264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:25.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:25 compute-1 sshd-session[133060]: Connection closed by invalid user ubuntu 123.58.212.100 port 60282 [preauth]
Feb 02 09:49:25 compute-1 python3.9[133266]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:49:25 compute-1 sudo[133264]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:25.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:26 compute-1 sudo[133417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dspdsmlrkbqpytozrdxyvhepfvbnlodl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025765.5896683-1693-110278002173700/AnsiballZ_copy.py'
Feb 02 09:49:26 compute-1 sudo[133417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:26 compute-1 ceph-mon[80115]: pgmap v285: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:49:26 compute-1 python3.9[133419]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770025765.5896683-1693-110278002173700/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:26 compute-1 sudo[133417]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:26 compute-1 sudo[133493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgqtvyzasqfbawvzqumonouapwsysedp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025765.5896683-1693-110278002173700/AnsiballZ_systemd.py'
Feb 02 09:49:26 compute-1 sudo[133493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:26 compute-1 sshd-session[133319]: Invalid user ubuntu from 123.58.212.100 port 60298
Feb 02 09:49:26 compute-1 python3.9[133495]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 02 09:49:26 compute-1 sshd-session[133319]: Connection closed by invalid user ubuntu 123.58.212.100 port 60298 [preauth]
Feb 02 09:49:26 compute-1 systemd[1]: Reloading.
Feb 02 09:49:26 compute-1 systemd-sysv-generator[133527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:49:26 compute-1 systemd-rc-local-generator[133523]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:49:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:27 compute-1 sudo[133493]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003e90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:27 compute-1 sudo[133608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phuhdlfmpelkgqpfadmklnkwsiztnkbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025765.5896683-1693-110278002173700/AnsiballZ_systemd.py'
Feb 02 09:49:27 compute-1 sudo[133608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:27.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:49:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:27.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:49:27 compute-1 python3.9[133610]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:49:27 compute-1 systemd[1]: Reloading.
Feb 02 09:49:27 compute-1 systemd-sysv-generator[133641]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:49:27 compute-1 systemd-rc-local-generator[133637]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:49:27 compute-1 systemd[1]: Starting ovn_controller container...
Feb 02 09:49:27 compute-1 sshd-session[133533]: Invalid user ubuntu from 123.58.212.100 port 60310
Feb 02 09:49:28 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:49:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/983c70ab7990a63defec4761ab2164346dd6a5e764615cf7607090ef8aebdfce/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 02 09:49:28 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2.
Feb 02 09:49:28 compute-1 podman[133650]: 2026-02-02 09:49:28.081297474 +0000 UTC m=+0.134897962 container init 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:49:28 compute-1 ovn_controller[133666]: + sudo -E kolla_set_configs
Feb 02 09:49:28 compute-1 podman[133650]: 2026-02-02 09:49:28.113701566 +0000 UTC m=+0.167302024 container start 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 02 09:49:28 compute-1 edpm-start-podman-container[133650]: ovn_controller
Feb 02 09:49:28 compute-1 systemd[1]: Created slice User Slice of UID 0.
Feb 02 09:49:28 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 02 09:49:28 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 02 09:49:28 compute-1 systemd[1]: Starting User Manager for UID 0...
Feb 02 09:49:28 compute-1 edpm-start-podman-container[133649]: Creating additional drop-in dependency for "ovn_controller" (1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2)
Feb 02 09:49:28 compute-1 systemd[133706]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 02 09:49:28 compute-1 podman[133673]: 2026-02-02 09:49:28.209005603 +0000 UTC m=+0.087636114 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 02 09:49:28 compute-1 systemd[1]: 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2-19dd92c79d235583.service: Main process exited, code=exited, status=1/FAILURE
Feb 02 09:49:28 compute-1 systemd[1]: 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2-19dd92c79d235583.service: Failed with result 'exit-code'.
Feb 02 09:49:28 compute-1 sshd-session[133533]: Connection closed by invalid user ubuntu 123.58.212.100 port 60310 [preauth]
Feb 02 09:49:28 compute-1 systemd[1]: Reloading.
Feb 02 09:49:28 compute-1 ceph-mon[80115]: pgmap v286: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:49:28 compute-1 systemd-rc-local-generator[133751]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:49:28 compute-1 systemd-sysv-generator[133754]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:49:28 compute-1 systemd[133706]: Queued start job for default target Main User Target.
Feb 02 09:49:28 compute-1 systemd[133706]: Created slice User Application Slice.
Feb 02 09:49:28 compute-1 systemd[133706]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 02 09:49:28 compute-1 systemd[133706]: Started Daily Cleanup of User's Temporary Directories.
Feb 02 09:49:28 compute-1 systemd[133706]: Reached target Paths.
Feb 02 09:49:28 compute-1 systemd[133706]: Reached target Timers.
Feb 02 09:49:28 compute-1 systemd[133706]: Starting D-Bus User Message Bus Socket...
Feb 02 09:49:28 compute-1 systemd[133706]: Starting Create User's Volatile Files and Directories...
Feb 02 09:49:28 compute-1 systemd[133706]: Finished Create User's Volatile Files and Directories.
Feb 02 09:49:28 compute-1 systemd[133706]: Listening on D-Bus User Message Bus Socket.
Feb 02 09:49:28 compute-1 systemd[133706]: Reached target Sockets.
Feb 02 09:49:28 compute-1 systemd[133706]: Reached target Basic System.
Feb 02 09:49:28 compute-1 systemd[133706]: Reached target Main User Target.
Feb 02 09:49:28 compute-1 systemd[133706]: Startup finished in 155ms.
Feb 02 09:49:28 compute-1 systemd[1]: Started User Manager for UID 0.
Feb 02 09:49:28 compute-1 systemd[1]: Started ovn_controller container.
Feb 02 09:49:28 compute-1 systemd[1]: Started Session c1 of User root.
Feb 02 09:49:28 compute-1 sudo[133608]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:28 compute-1 ovn_controller[133666]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 02 09:49:28 compute-1 ovn_controller[133666]: INFO:__main__:Validating config file
Feb 02 09:49:28 compute-1 ovn_controller[133666]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 02 09:49:28 compute-1 ovn_controller[133666]: INFO:__main__:Writing out command to execute
Feb 02 09:49:28 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Feb 02 09:49:28 compute-1 ovn_controller[133666]: ++ cat /run_command
Feb 02 09:49:28 compute-1 ovn_controller[133666]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 02 09:49:28 compute-1 ovn_controller[133666]: + ARGS=
Feb 02 09:49:28 compute-1 ovn_controller[133666]: + sudo kolla_copy_cacerts
Feb 02 09:49:28 compute-1 systemd[1]: Started Session c2 of User root.
Feb 02 09:49:28 compute-1 ovn_controller[133666]: + [[ ! -n '' ]]
Feb 02 09:49:28 compute-1 ovn_controller[133666]: + . kolla_extend_start
Feb 02 09:49:28 compute-1 ovn_controller[133666]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb 02 09:49:28 compute-1 ovn_controller[133666]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 02 09:49:28 compute-1 ovn_controller[133666]: + umask 0022
Feb 02 09:49:28 compute-1 ovn_controller[133666]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb 02 09:49:28 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.6234] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.6239] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <warn>  [1770025768.6241] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.6246] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.6250] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.6253] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 02 09:49:28 compute-1 kernel: br-int: entered promiscuous mode
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00014|main|INFO|OVS feature set changed, force recompute.
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00022|main|INFO|OVS feature set changed, force recompute.
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 02 09:49:28 compute-1 ovn_controller[133666]: 2026-02-02T09:49:28Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.6446] manager: (ovn-efcb63-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Feb 02 09:49:28 compute-1 systemd-udevd[133800]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 09:49:28 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.6627] device (genev_sys_6081): carrier: link connected
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.6630] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.8043] manager: (ovn-1b0741-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Feb 02 09:49:28 compute-1 NetworkManager[49055]: <info>  [1770025768.8704] manager: (ovn-031ca0-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Feb 02 09:49:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:28 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:29 compute-1 sshd-session[133762]: Invalid user ubuntu from 123.58.212.100 port 60326
Feb 02 09:49:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:29.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:29.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:29 compute-1 python3.9[133931]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 02 09:49:29 compute-1 sshd-session[133762]: Connection closed by invalid user ubuntu 123.58.212.100 port 60326 [preauth]
Feb 02 09:49:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:49:30 compute-1 ceph-mon[80115]: pgmap v287: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:49:30 compute-1 sudo[134083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odarlzwdxdjywaopejuifycszskbqkzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025770.2638068-1828-243749836080261/AnsiballZ_stat.py'
Feb 02 09:49:30 compute-1 sudo[134083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:30 compute-1 python3.9[134085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:30 compute-1 sudo[134083]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:30 compute-1 sshd-session[133956]: Invalid user ubuntu from 123.58.212.100 port 60340
Feb 02 09:49:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:31 compute-1 sshd-session[133956]: Connection closed by invalid user ubuntu 123.58.212.100 port 60340 [preauth]
Feb 02 09:49:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:31 compute-1 sudo[134207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evwwltzkycwuvwkakqvkrytqzjmakbwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025770.2638068-1828-243749836080261/AnsiballZ_copy.py'
Feb 02 09:49:31 compute-1 sudo[134207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:31 compute-1 python3.9[134209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025770.2638068-1828-243749836080261/.source.yaml _original_basename=.30gupeo4 follow=False checksum=49e9dd6dd1573230eefb068866cfd1da40e184ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:49:31 compute-1 sudo[134207]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:31.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:31.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:31 compute-1 sudo[134361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtrkwzsunfvtniqhtpybewtzkfqvpozb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025771.6758037-1873-40965719010618/AnsiballZ_command.py'
Feb 02 09:49:32 compute-1 sudo[134361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:32 compute-1 sshd-session[134210]: Invalid user ubuntu from 123.58.212.100 port 44248
Feb 02 09:49:32 compute-1 python3.9[134363]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:49:32 compute-1 ovs-vsctl[134364]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 02 09:49:32 compute-1 sudo[134361]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:32 compute-1 ceph-mon[80115]: pgmap v288: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:49:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:49:32 compute-1 sudo[134389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:49:32 compute-1 sudo[134389]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:49:32 compute-1 sudo[134389]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:32 compute-1 sshd-session[134210]: Connection closed by invalid user ubuntu 123.58.212.100 port 44248 [preauth]
Feb 02 09:49:32 compute-1 sudo[134414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Feb 02 09:49:32 compute-1 sudo[134414]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:49:32 compute-1 sudo[134414]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:32 compute-1 sudo[134589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wipjqsxbqlqrflnksmvzsbefjyanpwjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025772.5350606-1897-91304455548570/AnsiballZ_command.py'
Feb 02 09:49:32 compute-1 sudo[134589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:32 compute-1 sudo[134590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:49:32 compute-1 sudo[134590]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:49:32 compute-1 sudo[134590]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:32 compute-1 sudo[134617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:49:32 compute-1 sudo[134617]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:49:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:32 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:33 compute-1 python3.9[134597]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:49:33 compute-1 ovs-vsctl[134643]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 02 09:49:33 compute-1 sudo[134589]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:49:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:49:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8003ed0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:33 compute-1 sudo[134617]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:33.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:33 compute-1 sshd-session[134494]: Invalid user ubuntu from 123.58.212.100 port 44260
Feb 02 09:49:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:33.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:33 compute-1 sshd-session[134494]: Connection closed by invalid user ubuntu 123.58.212.100 port 44260 [preauth]
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:49:33 compute-1 ceph-mon[80115]: pgmap v289: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:49:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:49:33 compute-1 sudo[134826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqaekscsxrkvmgzpkupublshtkbpvbti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025773.624773-1939-150012624851943/AnsiballZ_command.py'
Feb 02 09:49:33 compute-1 sudo[134826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:34 compute-1 python3.9[134828]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:49:34 compute-1 ovs-vsctl[134831]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 02 09:49:34 compute-1 sudo[134826]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:34 compute-1 sudo[134856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:49:34 compute-1 sudo[134856]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:49:34 compute-1 sudo[134856]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:34 compute-1 sshd-session[122768]: Connection closed by 192.168.122.30 port 57730
Feb 02 09:49:34 compute-1 sshd-session[122765]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:49:34 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Feb 02 09:49:34 compute-1 systemd[1]: session-49.scope: Consumed 52.942s CPU time.
Feb 02 09:49:34 compute-1 systemd-logind[805]: Session 49 logged out. Waiting for processes to exit.
Feb 02 09:49:34 compute-1 systemd-logind[805]: Removed session 49.
Feb 02 09:49:34 compute-1 ceph-mon[80115]: pgmap v290: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Feb 02 09:49:34 compute-1 ceph-mon[80115]: pgmap v291: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:49:34 compute-1 sshd-session[134829]: Invalid user ubuntu from 123.58.212.100 port 44270
Feb 02 09:49:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:34 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:35 compute-1 sshd-session[134829]: Connection closed by invalid user ubuntu 123.58.212.100 port 44270 [preauth]
Feb 02 09:49:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 09:49:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 2411 writes, 14K keys, 2411 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s
                                           Cumulative WAL: 2411 writes, 2411 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2411 writes, 14K keys, 2411 commit groups, 1.0 writes per commit group, ingest: 38.02 MB, 0.06 MB/s
                                           Interval WAL: 2411 writes, 2411 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    106.8      0.20              0.04         6    0.033       0      0       0.0       0.0
                                             L6      1/0   11.64 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   2.9    102.0     88.5      0.70              0.11         5    0.139     21K   2265       0.0       0.0
                                            Sum      1/0   11.64 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9     79.5     92.5      0.89              0.16        11    0.081     21K   2265       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9     79.7     92.7      0.89              0.16        10    0.089     21K   2265       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    102.0     88.5      0.70              0.11         5    0.139     21K   2265       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    107.9      0.19              0.04         5    0.039       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.021, interval 0.021
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds
                                           Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a64debd350#2 capacity: 304.00 MB usage: 2.56 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 6.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(157,2.36 MB,0.777666%) FilterBlock(11,70.05 KB,0.0225017%) IndexBlock(11,132.27 KB,0.0424887%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Feb 02 09:49:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:35.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:35.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:49:36 compute-1 sshd-session[134885]: Invalid user ubuntu from 123.58.212.100 port 44274
Feb 02 09:49:36 compute-1 sshd-session[134885]: Connection closed by invalid user ubuntu 123.58.212.100 port 44274 [preauth]
Feb 02 09:49:36 compute-1 ceph-mon[80115]: pgmap v292: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:49:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:37.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:37.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:37 compute-1 sshd-session[134888]: Invalid user ubuntu from 123.58.212.100 port 44276
Feb 02 09:49:37 compute-1 sshd-session[134888]: Connection closed by invalid user ubuntu 123.58.212.100 port 44276 [preauth]
Feb 02 09:49:38 compute-1 sudo[134893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:49:38 compute-1 sudo[134893]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:49:38 compute-1 systemd[1]: Stopping User Manager for UID 0...
Feb 02 09:49:38 compute-1 systemd[133706]: Activating special unit Exit the Session...
Feb 02 09:49:38 compute-1 systemd[133706]: Stopped target Main User Target.
Feb 02 09:49:38 compute-1 systemd[133706]: Stopped target Basic System.
Feb 02 09:49:38 compute-1 systemd[133706]: Stopped target Paths.
Feb 02 09:49:38 compute-1 systemd[133706]: Stopped target Sockets.
Feb 02 09:49:38 compute-1 systemd[133706]: Stopped target Timers.
Feb 02 09:49:38 compute-1 systemd[133706]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 02 09:49:38 compute-1 systemd[133706]: Closed D-Bus User Message Bus Socket.
Feb 02 09:49:38 compute-1 sudo[134893]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:38 compute-1 systemd[133706]: Stopped Create User's Volatile Files and Directories.
Feb 02 09:49:38 compute-1 systemd[133706]: Removed slice User Application Slice.
Feb 02 09:49:38 compute-1 systemd[133706]: Reached target Shutdown.
Feb 02 09:49:38 compute-1 systemd[133706]: Finished Exit the Session.
Feb 02 09:49:38 compute-1 systemd[133706]: Reached target Exit the Session.
Feb 02 09:49:38 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Feb 02 09:49:38 compute-1 systemd[1]: Stopped User Manager for UID 0.
Feb 02 09:49:38 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 02 09:49:38 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 02 09:49:38 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 02 09:49:38 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 02 09:49:38 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Feb 02 09:49:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:38 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:38 compute-1 ceph-mon[80115]: pgmap v293: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Feb 02 09:49:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:49:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:49:39 compute-1 sshd-session[134890]: Invalid user ubuntu from 123.58.212.100 port 44284
Feb 02 09:49:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:39 compute-1 sshd-session[134890]: Connection closed by invalid user ubuntu 123.58.212.100 port 44284 [preauth]
Feb 02 09:49:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb 02 09:49:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:39.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb 02 09:49:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:39.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:40 compute-1 sshd-session[134919]: Invalid user ubuntu from 123.58.212.100 port 44290
Feb 02 09:49:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:40 compute-1 sshd-session[134921]: Accepted publickey for zuul from 192.168.122.30 port 53054 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:49:40 compute-1 systemd-logind[805]: New session 51 of user zuul.
Feb 02 09:49:40 compute-1 systemd[1]: Started Session 51 of User zuul.
Feb 02 09:49:40 compute-1 sshd-session[134921]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:49:40 compute-1 sshd-session[134919]: Connection closed by invalid user ubuntu 123.58.212.100 port 44290 [preauth]
Feb 02 09:49:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:40 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:41 compute-1 ceph-mon[80115]: pgmap v294: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:49:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:41.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:41.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:41 compute-1 python3.9[135077]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:49:41 compute-1 sshd-session[134978]: Invalid user ubuntu from 123.58.212.100 port 44300
Feb 02 09:49:42 compute-1 sshd-session[134978]: Connection closed by invalid user ubuntu 123.58.212.100 port 44300 [preauth]
Feb 02 09:49:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/094942 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:49:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:42 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:43 compute-1 sshd-session[135106]: Invalid user ubuntu from 123.58.212.100 port 44220
Feb 02 09:49:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:43 compute-1 sshd-session[135106]: Connection closed by invalid user ubuntu 123.58.212.100 port 44220 [preauth]
Feb 02 09:49:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:49:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:43.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:49:43 compute-1 sudo[135235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwkcfoldgyoatcumzurozdfiidcygehi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025783.1973248-58-82833179766649/AnsiballZ_file.py'
Feb 02 09:49:43 compute-1 sudo[135235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:43.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:43 compute-1 ceph-mon[80115]: pgmap v295: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:49:44 compute-1 python3.9[135239]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:44 compute-1 sudo[135235]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:44 compute-1 sudo[135389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exjycrmwxpdjcmqfhdfdeuzboouiqivf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025784.3290236-58-226713069428188/AnsiballZ_file.py'
Feb 02 09:49:44 compute-1 sshd-session[135236]: Invalid user ubuntu from 123.58.212.100 port 44236
Feb 02 09:49:44 compute-1 sudo[135389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:44 compute-1 sshd-session[135236]: Connection closed by invalid user ubuntu 123.58.212.100 port 44236 [preauth]
Feb 02 09:49:44 compute-1 python3.9[135391]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:44 compute-1 sudo[135389]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:44 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:44 compute-1 ceph-mon[80115]: pgmap v296: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 102 B/s wr, 0 op/s
Feb 02 09:49:45 compute-1 sudo[135544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgyhonwqgizjulytsupdbrdelumdnksr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025784.9343393-58-187631113425886/AnsiballZ_file.py'
Feb 02 09:49:45 compute-1 sudo[135544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:45 compute-1 python3.9[135546]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:45 compute-1 sudo[135544]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:45.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:45.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:45 compute-1 sudo[135696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzdqfbfhblukwmbepbpslueftbekmvgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025785.503818-58-248029402328898/AnsiballZ_file.py'
Feb 02 09:49:45 compute-1 sudo[135696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:45 compute-1 python3.9[135698]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:45 compute-1 sudo[135696]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:46 compute-1 sudo[135848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aedumasuwwcevtrfpjdoecgsqnsdlycp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025786.0780296-58-264273625534936/AnsiballZ_file.py'
Feb 02 09:49:46 compute-1 sudo[135848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:46 compute-1 python3.9[135850]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:46 compute-1 sudo[135848]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:46 compute-1 sshd-session[135473]: Invalid user ubuntu from 123.58.212.100 port 44244
Feb 02 09:49:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:46 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:46 compute-1 sshd-session[135473]: Connection closed by invalid user ubuntu 123.58.212.100 port 44244 [preauth]
Feb 02 09:49:47 compute-1 ceph-mon[80115]: pgmap v297: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:49:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:47.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:47 compute-1 python3.9[136002]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:49:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:47.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:49:48 compute-1 sudo[136154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiayhharelbggotxotkjgtvfupfmsado ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025787.7308905-190-222622303708059/AnsiballZ_seboolean.py'
Feb 02 09:49:48 compute-1 sudo[136154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:48 compute-1 sshd-session[136003]: Invalid user ubuntu from 123.58.212.100 port 44250
Feb 02 09:49:48 compute-1 python3.9[136156]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 02 09:49:48 compute-1 sshd-session[136003]: Connection closed by invalid user ubuntu 123.58.212.100 port 44250 [preauth]
Feb 02 09:49:48 compute-1 sudo[136154]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:48 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:49 compute-1 ceph-mon[80115]: pgmap v298: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:49:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:49.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:49.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:49 compute-1 sshd-session[136157]: Invalid user ubuntu from 123.58.212.100 port 44264
Feb 02 09:49:49 compute-1 python3.9[136309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:49 compute-1 sshd-session[136157]: Connection closed by invalid user ubuntu 123.58.212.100 port 44264 [preauth]
Feb 02 09:49:50 compute-1 python3.9[136432]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025789.1167662-214-80255114617812/.source follow=False _original_basename=haproxy.j2 checksum=35fdf371a5549b7e7e32a6541c07c1ac75cf4dcf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:50 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:51 compute-1 python3.9[136583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:51 compute-1 ceph-mon[80115]: pgmap v299: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:49:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:51.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:51 compute-1 python3.9[136704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025790.6593184-259-190124914151455/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:51 compute-1 sshd-session[136398]: Invalid user ubuntu from 123.58.212.100 port 44278
Feb 02 09:49:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:51.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:51 compute-1 sshd-session[136398]: Connection closed by invalid user ubuntu 123.58.212.100 port 44278 [preauth]
Feb 02 09:49:52 compute-1 sudo[136856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iovamifbblcvrnsqtknkvuykxqdxsceo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025792.1452677-310-101578240659487/AnsiballZ_setup.py'
Feb 02 09:49:52 compute-1 sudo[136856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:52 compute-1 python3.9[136858]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:49:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:52 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:52 compute-1 sudo[136856]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:53 compute-1 sshd-session[136752]: Invalid user ubuntu from 123.58.212.100 port 59960
Feb 02 09:49:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:53 compute-1 ceph-mon[80115]: pgmap v300: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:49:53 compute-1 sudo[136941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgwvblaplowhmlopkjxbdddxsqjnywwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025792.1452677-310-101578240659487/AnsiballZ_dnf.py'
Feb 02 09:49:53 compute-1 sudo[136941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:53.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:53 compute-1 sshd-session[136752]: Connection closed by invalid user ubuntu 123.58.212.100 port 59960 [preauth]
Feb 02 09:49:53 compute-1 python3.9[136943]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:49:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:53.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:54 compute-1 sudo[136947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:49:54 compute-1 sudo[136947]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:49:54 compute-1 sudo[136947]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:54 compute-1 sshd-session[136945]: Invalid user ubuntu from 123.58.212.100 port 59972
Feb 02 09:49:54 compute-1 sudo[136941]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:54 compute-1 sshd-session[136945]: Connection closed by invalid user ubuntu 123.58.212.100 port 59972 [preauth]
Feb 02 09:49:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:54 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:55 compute-1 ceph-mon[80115]: pgmap v301: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:49:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:55.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:49:55 compute-1 sudo[137124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iliftnjasaisjtkrzgiwngiyncaelzmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025795.0830352-346-118260464969516/AnsiballZ_systemd.py'
Feb 02 09:49:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:55 compute-1 sudo[137124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:49:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:55.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:55 compute-1 python3.9[137126]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 02 09:49:56 compute-1 sudo[137124]: pam_unix(sudo:session): session closed for user root
Feb 02 09:49:56 compute-1 sshd-session[137015]: Invalid user ubuntu from 123.58.212.100 port 59978
Feb 02 09:49:56 compute-1 sshd-session[137015]: Connection closed by invalid user ubuntu 123.58.212.100 port 59978 [preauth]
Feb 02 09:49:56 compute-1 python3.9[137281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:56 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:57 compute-1 python3.9[137403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025796.3345869-370-110931356564794/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:57 compute-1 ceph-mon[80115]: pgmap v302: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:49:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:57.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:57 compute-1 sshd-session[137253]: Invalid user ubuntu from 123.58.212.100 port 59990
Feb 02 09:49:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:57.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:57 compute-1 sshd-session[137253]: Connection closed by invalid user ubuntu 123.58.212.100 port 59990 [preauth]
Feb 02 09:49:57 compute-1 python3.9[137553]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:49:58 compute-1 ovn_controller[133666]: 2026-02-02T09:49:58Z|00025|memory|INFO|16384 kB peak resident set size after 29.8 seconds
Feb 02 09:49:58 compute-1 ovn_controller[133666]: 2026-02-02T09:49:58Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Feb 02 09:49:58 compute-1 podman[137650]: 2026-02-02 09:49:58.422710751 +0000 UTC m=+0.120206060 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 02 09:49:58 compute-1 python3.9[137687]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025797.4807904-370-194460599842836/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:49:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:58 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:59 compute-1 sshd-session[137624]: Invalid user ubuntu from 123.58.212.100 port 59998
Feb 02 09:49:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:59 compute-1 sshd-session[137624]: Connection closed by invalid user ubuntu 123.58.212.100 port 59998 [preauth]
Feb 02 09:49:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:49:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:49:59 compute-1 ceph-mon[80115]: pgmap v303: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:49:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:49:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:49:59.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:49:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:49:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:49:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:49:59.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:49:59 compute-1 python3.9[137851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:00 compute-1 python3.9[137974]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025799.2823985-502-265633110977756/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:50:00 compute-1 ceph-mon[80115]: Health detail: HEALTH_WARN 3 failed cephadm daemon(s)
Feb 02 09:50:00 compute-1 ceph-mon[80115]: [WRN] CEPHADM_FAILED_DAEMON: 3 failed cephadm daemon(s)
Feb 02 09:50:00 compute-1 ceph-mon[80115]:     daemon nfs.cephfs.2.0.compute-0.fdwwab on compute-0 is in unknown state
Feb 02 09:50:00 compute-1 ceph-mon[80115]:     daemon nfs.cephfs.0.0.compute-1.mhzhsx on compute-1 is in unknown state
Feb 02 09:50:00 compute-1 ceph-mon[80115]:     daemon nfs.cephfs.1.0.compute-2.dciyfa on compute-2 is in unknown state
Feb 02 09:50:00 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:00 compute-1 python3.9[138124]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:01 compute-1 sshd-session[137852]: Invalid user ubuntu from 123.58.212.100 port 60006
Feb 02 09:50:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:01 compute-1 sshd-session[137852]: Connection closed by invalid user ubuntu 123.58.212.100 port 60006 [preauth]
Feb 02 09:50:01 compute-1 python3.9[138246]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025800.388307-502-11700354517451/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:50:01 compute-1 ceph-mon[80115]: pgmap v304: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:50:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:50:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:01.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:50:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:01.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:02 compute-1 python3.9[138398]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:50:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:50:02 compute-1 sshd-session[138271]: Invalid user ubuntu from 123.58.212.100 port 42300
Feb 02 09:50:02 compute-1 sudo[138551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utbrxgrzehpserhilrijhrvxztrfnyaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025802.4567404-616-181104987157215/AnsiballZ_file.py'
Feb 02 09:50:02 compute-1 sudo[138551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:02 compute-1 sshd-session[138271]: Connection closed by invalid user ubuntu 123.58.212.100 port 42300 [preauth]
Feb 02 09:50:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:02 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:02 compute-1 python3.9[138553]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:50:02 compute-1 sudo[138551]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:03 compute-1 ceph-mon[80115]: pgmap v305: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:50:03 compute-1 sudo[138705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjuuflpizfkyrnrfciqrvpedntueetqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025803.1774495-640-231925764186846/AnsiballZ_stat.py'
Feb 02 09:50:03 compute-1 sudo[138705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:03.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:03 compute-1 python3.9[138707]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:03 compute-1 sudo[138705]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:03.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:03 compute-1 sudo[138783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xazdmuxqgazagewzvzqktssqxlzhnsed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025803.1774495-640-231925764186846/AnsiballZ_file.py'
Feb 02 09:50:03 compute-1 sudo[138783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:03 compute-1 sshd-session[138578]: Invalid user ubuntu from 123.58.212.100 port 42312
Feb 02 09:50:04 compute-1 python3.9[138785]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:50:04 compute-1 sudo[138783]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:04 compute-1 sshd-session[138578]: Connection closed by invalid user ubuntu 123.58.212.100 port 42312 [preauth]
Feb 02 09:50:04 compute-1 sudo[138937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrznrjhgsasxrwqfcksptbyczmningff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025804.2330456-640-34236061730657/AnsiballZ_stat.py'
Feb 02 09:50:04 compute-1 sudo[138937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:04 compute-1 ceph-mon[80115]: pgmap v306: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:50:04 compute-1 python3.9[138939]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:04 compute-1 sudo[138937]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:04 compute-1 sudo[139016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flrapwlfvyzsmbegnyvhxsbxuefpbqgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025804.2330456-640-34236061730657/AnsiballZ_file.py'
Feb 02 09:50:04 compute-1 sudo[139016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:04 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:05 compute-1 python3.9[139018]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:50:05 compute-1 sudo[139016]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc0033b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:05 compute-1 sshd-session[138909]: Invalid user ubuntu from 123.58.212.100 port 42324
Feb 02 09:50:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:05.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:05 compute-1 sshd-session[138909]: Connection closed by invalid user ubuntu 123.58.212.100 port 42324 [preauth]
Feb 02 09:50:05 compute-1 sudo[139168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkgllymyhmxogakyjzclqtnqdbetpjxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025805.3578262-709-167163614538503/AnsiballZ_file.py'
Feb 02 09:50:05 compute-1 sudo[139168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:05.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:05 compute-1 python3.9[139170]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:05 compute-1 sudo[139168]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:06 compute-1 sudo[139320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufmmwxfyyfmfdkcqwgbgcntqyhcosuuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025806.1035876-733-112331510660667/AnsiballZ_stat.py'
Feb 02 09:50:06 compute-1 sudo[139320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:06 compute-1 python3.9[139322]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:06 compute-1 ceph-mon[80115]: pgmap v307: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:50:06 compute-1 sudo[139320]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:06 compute-1 sudo[139399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tybmbxpgokdcletzkdxuamssmwlkvkwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025806.1035876-733-112331510660667/AnsiballZ_file.py'
Feb 02 09:50:06 compute-1 sudo[139399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:06 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:07 compute-1 python3.9[139401]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:07 compute-1 sudo[139399]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:07.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:07 compute-1 sudo[139556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtkgkjhuntkvndfctwhtrkezcktasziv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025807.3293111-769-75625954964560/AnsiballZ_stat.py'
Feb 02 09:50:07 compute-1 sudo[139556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:50:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:07.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:50:07 compute-1 sshd-session[139402]: Invalid user ubuntu from 123.58.212.100 port 42334
Feb 02 09:50:07 compute-1 python3.9[139558]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:07 compute-1 sudo[139556]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:07 compute-1 sshd-session[139402]: Connection closed by invalid user ubuntu 123.58.212.100 port 42334 [preauth]
Feb 02 09:50:08 compute-1 sudo[139634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmjijvygbbirpmkdjaqpzaozffghffbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025807.3293111-769-75625954964560/AnsiballZ_file.py'
Feb 02 09:50:08 compute-1 sudo[139634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:08 compute-1 python3.9[139636]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:08 compute-1 sudo[139634]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:08 compute-1 ceph-mon[80115]: pgmap v308: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:50:08 compute-1 sudo[139789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swjoyxojbszriqtflcboxctokmjoadfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025808.560806-805-109632420687588/AnsiballZ_systemd.py'
Feb 02 09:50:08 compute-1 sudo[139789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:08 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:09 compute-1 python3.9[139791]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:50:09 compute-1 systemd[1]: Reloading.
Feb 02 09:50:09 compute-1 systemd-sysv-generator[139823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:50:09 compute-1 systemd-rc-local-generator[139819]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:50:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:09 compute-1 sshd-session[139637]: Invalid user ubuntu from 123.58.212.100 port 42344
Feb 02 09:50:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:09 compute-1 sudo[139789]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:09.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:09 compute-1 sshd-session[139637]: Connection closed by invalid user ubuntu 123.58.212.100 port 42344 [preauth]
Feb 02 09:50:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:09.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:09 compute-1 sudo[139981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dymfayfhzelyrhaatmdmwbbiytittpkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025809.6597314-829-211398891849872/AnsiballZ_stat.py'
Feb 02 09:50:09 compute-1 sudo[139981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:10 compute-1 python3.9[139983]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:10 compute-1 sudo[139981]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095010 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:50:10 compute-1 sudo[140059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iprdqnesythjxdkqvvlvvrdhbnwpbqji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025809.6597314-829-211398891849872/AnsiballZ_file.py'
Feb 02 09:50:10 compute-1 sudo[140059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:10 compute-1 python3.9[140061]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:10 compute-1 sudo[140059]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:10 compute-1 ceph-mon[80115]: pgmap v309: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:50:10 compute-1 sshd-session[139929]: Invalid user ubuntu from 123.58.212.100 port 42346
Feb 02 09:50:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:10 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db00010d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:10 compute-1 sshd-session[139929]: Connection closed by invalid user ubuntu 123.58.212.100 port 42346 [preauth]
Feb 02 09:50:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:11 compute-1 sudo[140212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlzvlxgagyjsciatdprihrgiptcpnfhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025810.8974698-865-53710055341847/AnsiballZ_stat.py'
Feb 02 09:50:11 compute-1 sudo[140212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:11 compute-1 python3.9[140214]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:11 compute-1 sudo[140212]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:11.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:11 compute-1 sudo[140292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwepnmrtttdjtpccbdahcykjgupjdzba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025810.8974698-865-53710055341847/AnsiballZ_file.py'
Feb 02 09:50:11 compute-1 sudo[140292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:11.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:11 compute-1 python3.9[140294]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:11 compute-1 sudo[140292]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:12 compute-1 sshd-session[140215]: Invalid user ubuntu from 123.58.212.100 port 42360
Feb 02 09:50:12 compute-1 sshd-session[140215]: Connection closed by invalid user ubuntu 123.58.212.100 port 42360 [preauth]
Feb 02 09:50:12 compute-1 sudo[140444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnsslfkekvlcpxwbmcibsmmqaomsnojs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025812.1040165-901-250581101739913/AnsiballZ_systemd.py'
Feb 02 09:50:12 compute-1 sudo[140444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:12 compute-1 python3.9[140446]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:50:12 compute-1 systemd[1]: Reloading.
Feb 02 09:50:12 compute-1 systemd-sysv-generator[140472]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:50:12 compute-1 systemd-rc-local-generator[140465]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:50:12 compute-1 ceph-mon[80115]: pgmap v310: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:50:12 compute-1 systemd[1]: Starting Create netns directory...
Feb 02 09:50:12 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 02 09:50:12 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 02 09:50:12 compute-1 systemd[1]: Finished Create netns directory.
Feb 02 09:50:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:12 compute-1 sudo[140444]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0001e80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:13.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:13.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:14 compute-1 sudo[140638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odfejwswmrpoukheppcbqlejerjxzwvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025813.8846776-931-6530832727791/AnsiballZ_file.py'
Feb 02 09:50:14 compute-1 sudo[140638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:14 compute-1 python3.9[140640]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:50:14 compute-1 sudo[140638]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:14 compute-1 sudo[140665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:50:14 compute-1 sudo[140665]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:50:14 compute-1 sudo[140665]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:14 compute-1 ceph-mon[80115]: pgmap v311: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:50:14 compute-1 sudo[140816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmaceqxuamwiiwzapljepdducuhqxlfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025814.613436-955-95466646659596/AnsiballZ_stat.py'
Feb 02 09:50:14 compute-1 sudo[140816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:14 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:15 compute-1 python3.9[140818]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:15 compute-1 sudo[140816]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:15 compute-1 sudo[140939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbsejlzcezsiqnecleaqiqydejtiqeiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025814.613436-955-95466646659596/AnsiballZ_copy.py'
Feb 02 09:50:15 compute-1 sudo[140939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:15.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:15 compute-1 python3.9[140941]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770025814.613436-955-95466646659596/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:50:15 compute-1 sudo[140939]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:15.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:16 compute-1 sudo[141093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqwmbjcanltjsuqpxdsbkixcnujmkqbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025816.073592-1006-168392061310116/AnsiballZ_file.py'
Feb 02 09:50:16 compute-1 sudo[141093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:16 compute-1 python3.9[141095]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:16 compute-1 sshd-session[140942]: Invalid user ubuntu from 123.58.212.100 port 50914
Feb 02 09:50:16 compute-1 sudo[141093]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:16 compute-1 sshd-session[140942]: Connection closed by invalid user ubuntu 123.58.212.100 port 50914 [preauth]
Feb 02 09:50:16 compute-1 ceph-mon[80115]: pgmap v312: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:50:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:16 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001f70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:17 compute-1 sudo[141248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctulvpakjvucjfdlewayftpkybuhhcnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025816.801128-1030-93985227039656/AnsiballZ_file.py'
Feb 02 09:50:17 compute-1 sudo[141248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:17 compute-1 python3.9[141250]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:50:17 compute-1 sudo[141248]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:17.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:50:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:17.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:50:17 compute-1 sudo[141400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzthbtnbjelufdmkuaqzsojkyrjstitj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025817.613887-1054-149016092379195/AnsiballZ_stat.py'
Feb 02 09:50:17 compute-1 sudo[141400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:50:17 compute-1 python3.9[141402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:50:17 compute-1 sudo[141400]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:18 compute-1 sshd-session[141220]: Invalid user ubuntu from 123.58.212.100 port 50924
Feb 02 09:50:18 compute-1 sudo[141523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obbxzsvvadphbogxiqvhusfllhuuhfnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025817.613887-1054-149016092379195/AnsiballZ_copy.py'
Feb 02 09:50:18 compute-1 sudo[141523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:18 compute-1 sshd-session[141220]: Connection closed by invalid user ubuntu 123.58.212.100 port 50924 [preauth]
Feb 02 09:50:18 compute-1 python3.9[141525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025817.613887-1054-149016092379195/.source.json _original_basename=._0d2hgfi follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:18 compute-1 sudo[141523]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095018 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:50:18 compute-1 ceph-mon[80115]: pgmap v313: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:50:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:18 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db00029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:19 compute-1 python3.9[141678]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:19.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:19 compute-1 sshd-session[141550]: Invalid user ubuntu from 123.58.212.100 port 50934
Feb 02 09:50:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:19.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:19 compute-1 sshd-session[141550]: Connection closed by invalid user ubuntu 123.58.212.100 port 50934 [preauth]
Feb 02 09:50:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:20 compute-1 ceph-mon[80115]: pgmap v314: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Feb 02 09:50:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:50:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:50:21 compute-1 sshd-session[141829]: Invalid user ubuntu from 123.58.212.100 port 50940
Feb 02 09:50:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db00029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:21 compute-1 sshd-session[141829]: Connection closed by invalid user ubuntu 123.58.212.100 port 50940 [preauth]
Feb 02 09:50:21 compute-1 sudo[142102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvdzrrdzfddqihjpwjtlpzksrvbrgkqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025821.0115652-1174-235524369529334/AnsiballZ_container_config_data.py'
Feb 02 09:50:21 compute-1 sudo[142102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:21.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:21 compute-1 python3.9[142104]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 02 09:50:21 compute-1 sudo[142102]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:21.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:22 compute-1 sudo[142256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxijphuzqpyhryfuhrblaalzhizmeygz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025822.1557326-1207-97213216927328/AnsiballZ_container_config_hash.py'
Feb 02 09:50:22 compute-1 sudo[142256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:22 compute-1 python3.9[142258]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 02 09:50:22 compute-1 sudo[142256]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:22 compute-1 sshd-session[142105]: Invalid user ubuntu from 123.58.212.100 port 53060
Feb 02 09:50:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:22 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:22 compute-1 ceph-mon[80115]: pgmap v315: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Feb 02 09:50:23 compute-1 sshd-session[142105]: Connection closed by invalid user ubuntu 123.58.212.100 port 53060 [preauth]
Feb 02 09:50:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:50:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:23.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:50:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:23.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:23 compute-1 sudo[142411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcvvxvnvlhlqslhgbqzspbejgkzrmmzf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1770025823.1869295-1237-218821618434565/AnsiballZ_edpm_container_manage.py'
Feb 02 09:50:23 compute-1 sudo[142411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:24 compute-1 python3[142413]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 02 09:50:24 compute-1 sshd-session[142333]: Invalid user ubuntu from 123.58.212.100 port 53070
Feb 02 09:50:24 compute-1 sshd-session[142333]: Connection closed by invalid user ubuntu 123.58.212.100 port 53070 [preauth]
Feb 02 09:50:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:24 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:50:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:24 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:50:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:24 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:24 compute-1 ceph-mon[80115]: pgmap v316: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:50:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:25.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:25 compute-1 sshd-session[142440]: Invalid user ubuntu from 123.58.212.100 port 53084
Feb 02 09:50:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:50:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:25.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:50:25 compute-1 sshd-session[142440]: Connection closed by invalid user ubuntu 123.58.212.100 port 53084 [preauth]
Feb 02 09:50:26 compute-1 sshd-session[142480]: Invalid user ubuntu from 123.58.212.100 port 53092
Feb 02 09:50:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:27 compute-1 ceph-mon[80115]: pgmap v317: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:50:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:27.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:27 compute-1 sshd-session[142480]: Connection closed by invalid user ubuntu 123.58.212.100 port 53092 [preauth]
Feb 02 09:50:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:27.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:50:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:50:28 compute-1 sshd-session[142500]: Invalid user ubuntu from 123.58.212.100 port 53102
Feb 02 09:50:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:28 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:29 compute-1 sshd-session[142500]: Connection closed by invalid user ubuntu 123.58.212.100 port 53102 [preauth]
Feb 02 09:50:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:29.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:29.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:30 compute-1 sshd-session[142515]: Invalid user ubuntu from 123.58.212.100 port 53104
Feb 02 09:50:30 compute-1 ceph-mon[80115]: pgmap v318: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:50:30 compute-1 sshd-session[142515]: Connection closed by invalid user ubuntu 123.58.212.100 port 53104 [preauth]
Feb 02 09:50:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:50:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:50:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:30 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:31 compute-1 podman[142503]: 2026-02-02 09:50:31.535776443 +0000 UTC m=+2.769583886 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 02 09:50:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:31.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:31.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:31 compute-1 sshd-session[142525]: Invalid user ubuntu from 123.58.212.100 port 53118
Feb 02 09:50:32 compute-1 sshd-session[142525]: Connection closed by invalid user ubuntu 123.58.212.100 port 53118 [preauth]
Feb 02 09:50:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:32 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:33 compute-1 sshd-session[142556]: Invalid user ubuntu from 123.58.212.100 port 40682
Feb 02 09:50:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:33 compute-1 sshd-session[142556]: Connection closed by invalid user ubuntu 123.58.212.100 port 40682 [preauth]
Feb 02 09:50:33 compute-1 ceph-mon[80115]: pgmap v319: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 98 KiB/s rd, 1.2 KiB/s wr, 162 op/s
Feb 02 09:50:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:33.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:50:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:50:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:33.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:50:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:50:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095034 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:50:34 compute-1 sshd-session[142560]: Invalid user ubuntu from 123.58.212.100 port 40690
Feb 02 09:50:34 compute-1 sudo[142581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:50:34 compute-1 sudo[142581]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:50:34 compute-1 sudo[142581]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:34 compute-1 sshd-session[142560]: Connection closed by invalid user ubuntu 123.58.212.100 port 40690 [preauth]
Feb 02 09:50:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:34 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:35 compute-1 ceph-mon[80115]: pgmap v320: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 97 KiB/s rd, 767 B/s wr, 161 op/s
Feb 02 09:50:35 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:50:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:35.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:35.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:35 compute-1 sshd-session[142607]: Invalid user ubuntu from 123.58.212.100 port 40694
Feb 02 09:50:36 compute-1 sshd-session[142607]: Connection closed by invalid user ubuntu 123.58.212.100 port 40694 [preauth]
Feb 02 09:50:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:50:37 compute-1 ceph-mon[80115]: pgmap v321: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 98 KiB/s rd, 853 B/s wr, 162 op/s
Feb 02 09:50:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:37 compute-1 podman[142427]: 2026-02-02 09:50:37.315236134 +0000 UTC m=+13.236378666 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb 02 09:50:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:37 compute-1 sshd-session[142609]: Invalid user ubuntu from 123.58.212.100 port 40706
Feb 02 09:50:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:37.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:37 compute-1 podman[142632]: 2026-02-02 09:50:37.477740864 +0000 UTC m=+0.036223866 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb 02 09:50:37 compute-1 sshd-session[142609]: Connection closed by invalid user ubuntu 123.58.212.100 port 40706 [preauth]
Feb 02 09:50:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:37.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:37 compute-1 podman[142632]: 2026-02-02 09:50:37.891633644 +0000 UTC m=+0.450116596 container create 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 02 09:50:37 compute-1 python3[142413]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb 02 09:50:38 compute-1 sudo[142411]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:38 compute-1 ceph-mon[80115]: pgmap v322: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 97 KiB/s rd, 852 B/s wr, 161 op/s
Feb 02 09:50:38 compute-1 sudo[142821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiaeqaffokolequaxokaqfbgcwwiqbvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025838.2067342-1261-246143225683215/AnsiballZ_stat.py'
Feb 02 09:50:38 compute-1 sudo[142821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:38 compute-1 python3.9[142823]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:50:38 compute-1 sudo[142821]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:38 compute-1 sshd-session[142645]: Invalid user ubuntu from 123.58.212.100 port 40722
Feb 02 09:50:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:38 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:38 compute-1 sudo[142851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:50:38 compute-1 sudo[142851]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:50:38 compute-1 sudo[142851]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:39 compute-1 sshd-session[142645]: Connection closed by invalid user ubuntu 123.58.212.100 port 40722 [preauth]
Feb 02 09:50:39 compute-1 sudo[142900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:50:39 compute-1 sudo[142900]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:50:39 compute-1 ceph-mon[80115]: pgmap v323: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 97 KiB/s rd, 852 B/s wr, 161 op/s
Feb 02 09:50:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:39 compute-1 sudo[143050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inhpyezpjaimaahmqpdlvdvlcaxphnak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025839.09231-1288-175025987259291/AnsiballZ_file.py'
Feb 02 09:50:39 compute-1 sudo[143050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:39 compute-1 sudo[142900]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:39.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:39 compute-1 python3.9[143052]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:39 compute-1 sudo[143050]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:39.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:39 compute-1 sudo[143138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frnewfcxxmgmzyvfsjaydrnoriawpntx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025839.09231-1288-175025987259291/AnsiballZ_stat.py'
Feb 02 09:50:39 compute-1 sudo[143138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:40 compute-1 python3.9[143140]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:50:40 compute-1 sudo[143138]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:50:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:50:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:50:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:50:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:50:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:50:40 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:50:40 compute-1 sudo[143289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfzuqvclrkgtjvbybcestqptwwmeebuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025840.1800416-1288-191941218005665/AnsiballZ_copy.py'
Feb 02 09:50:40 compute-1 sudo[143289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095040 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:50:40 compute-1 python3.9[143291]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770025840.1800416-1288-191941218005665/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:40 compute-1 sudo[143289]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:40 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:41 compute-1 sudo[143366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjgsmsdwziafpevgplysgleipradwzmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025840.1800416-1288-191941218005665/AnsiballZ_systemd.py'
Feb 02 09:50:41 compute-1 sudo[143366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc001690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:41 compute-1 python3.9[143368]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 02 09:50:41 compute-1 systemd[1]: Reloading.
Feb 02 09:50:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:41 compute-1 systemd-sysv-generator[143393]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:50:41 compute-1 systemd-rc-local-generator[143387]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:50:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:41.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:41 compute-1 sudo[143366]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:41.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:41 compute-1 sudo[143477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byoqsatcbdpriwpvbidfsouwuebllwlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025840.1800416-1288-191941218005665/AnsiballZ_systemd.py'
Feb 02 09:50:41 compute-1 sudo[143477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:42 compute-1 ceph-mon[80115]: pgmap v324: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 98 KiB/s rd, 1.3 KiB/s wr, 163 op/s
Feb 02 09:50:42 compute-1 ceph-mon[80115]: pgmap v325: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 703 B/s wr, 3 op/s
Feb 02 09:50:42 compute-1 ceph-mon[80115]: pgmap v326: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 874 B/s wr, 3 op/s
Feb 02 09:50:42 compute-1 python3.9[143479]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:50:42 compute-1 systemd[1]: Reloading.
Feb 02 09:50:42 compute-1 systemd-rc-local-generator[143505]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:50:42 compute-1 systemd-sysv-generator[143511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:50:42 compute-1 sshd-session[142963]: Invalid user ubuntu from 123.58.212.100 port 40726
Feb 02 09:50:42 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Feb 02 09:50:42 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:50:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f162e3e4e4516f70737a87aceaba248606c2e295597c4fa34890d6a1f85ad4a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 02 09:50:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f162e3e4e4516f70737a87aceaba248606c2e295597c4fa34890d6a1f85ad4a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 02 09:50:42 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d.
Feb 02 09:50:42 compute-1 sshd-session[142963]: Connection closed by invalid user ubuntu 123.58.212.100 port 40726 [preauth]
Feb 02 09:50:42 compute-1 podman[143520]: 2026-02-02 09:50:42.877798747 +0000 UTC m=+0.287535974 container init 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: + sudo -E kolla_set_configs
Feb 02 09:50:42 compute-1 podman[143520]: 2026-02-02 09:50:42.910981631 +0000 UTC m=+0.320718818 container start 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 02 09:50:42 compute-1 edpm-start-podman-container[143520]: ovn_metadata_agent
Feb 02 09:50:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:42 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Validating config file
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Copying service configuration files
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Writing out command to execute
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: ++ cat /run_command
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: + CMD=neutron-ovn-metadata-agent
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: + ARGS=
Feb 02 09:50:42 compute-1 ovn_metadata_agent[143537]: + sudo kolla_copy_cacerts
Feb 02 09:50:43 compute-1 ovn_metadata_agent[143537]: + [[ ! -n '' ]]
Feb 02 09:50:43 compute-1 ovn_metadata_agent[143537]: + . kolla_extend_start
Feb 02 09:50:43 compute-1 ovn_metadata_agent[143537]: Running command: 'neutron-ovn-metadata-agent'
Feb 02 09:50:43 compute-1 ovn_metadata_agent[143537]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 02 09:50:43 compute-1 ovn_metadata_agent[143537]: + umask 0022
Feb 02 09:50:43 compute-1 ovn_metadata_agent[143537]: + exec neutron-ovn-metadata-agent
Feb 02 09:50:43 compute-1 podman[143544]: 2026-02-02 09:50:43.01755582 +0000 UTC m=+0.096332477 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 02 09:50:43 compute-1 edpm-start-podman-container[143519]: Creating additional drop-in dependency for "ovn_metadata_agent" (9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d)
Feb 02 09:50:43 compute-1 systemd[1]: Reloading.
Feb 02 09:50:43 compute-1 systemd-rc-local-generator[143619]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:50:43 compute-1 systemd-sysv-generator[143624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:50:43 compute-1 ceph-mon[80115]: pgmap v327: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 749 B/s wr, 2 op/s
Feb 02 09:50:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:43 compute-1 systemd[1]: Started ovn_metadata_agent container.
Feb 02 09:50:43 compute-1 sudo[143477]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:50:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:43.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:50:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:43.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:44 compute-1 sshd-session[143591]: Invalid user ubuntu from 123.58.212.100 port 33728
Feb 02 09:50:44 compute-1 sshd-session[143591]: Connection closed by invalid user ubuntu 123.58.212.100 port 33728 [preauth]
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.856 143542 INFO neutron.common.config [-] Logging enabled!
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.856 143542 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.856 143542 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.857 143542 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.858 143542 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.859 143542 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.860 143542 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.861 143542 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.862 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.863 143542 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.864 143542 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.865 143542 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.866 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.867 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.868 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.869 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.870 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.871 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.872 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.873 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.874 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.875 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.876 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.877 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.878 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.879 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.880 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.881 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.882 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.883 143542 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.884 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.885 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.886 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.887 143542 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.888 143542 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.895 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.895 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.895 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.896 143542 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.896 143542 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.908 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2f54a3b0-231a-4b96-9e3a-0a36e3e73216 (UUID: 2f54a3b0-231a-4b96-9e3a-0a36e3e73216) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.929 143542 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.930 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.930 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.930 143542 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.932 143542 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.937 143542 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.943 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2f54a3b0-231a-4b96-9e3a-0a36e3e73216'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], external_ids={}, name=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, nb_cfg_timestamp=1770025776644, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.943 143542 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f9094eeff70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.944 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.944 143542 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.944 143542 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.945 143542 INFO oslo_service.service [-] Starting 1 workers
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.947 143542 DEBUG oslo_service.service [-] Started child 143784 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.950 143542 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpbue2c5ic/privsep.sock']
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.950 143784 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-2066399'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 02 09:50:44 compute-1 python3.9[143783]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.974 143784 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.974 143784 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.975 143784 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.977 143784 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 02 09:50:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:44 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.983 143784 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 02 09:50:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:44.988 143784 INFO eventlet.wsgi.server [-] (143784) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 02 09:50:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:45 compute-1 ceph-mon[80115]: pgmap v328: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 750 B/s wr, 2 op/s
Feb 02 09:50:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:45 compute-1 sshd-session[143707]: Invalid user ubuntu from 123.58.212.100 port 33736
Feb 02 09:50:45 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 02 09:50:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:45.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:45 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.591 143542 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 02 09:50:45 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.592 143542 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbue2c5ic/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 02 09:50:45 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.445 143813 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 02 09:50:45 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.452 143813 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 02 09:50:45 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.455 143813 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 02 09:50:45 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.456 143813 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143813
Feb 02 09:50:45 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:45.596 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[2de41c97-5f96-4b14-be02-f1c369ad3d48]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 09:50:45 compute-1 sshd-session[143707]: Connection closed by invalid user ubuntu 123.58.212.100 port 33736 [preauth]
Feb 02 09:50:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:45.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:45 compute-1 sudo[143943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxjpzktkwqwdiuvcdstoijuxlerniidg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025845.621254-1423-185411318382995/AnsiballZ_stat.py'
Feb 02 09:50:45 compute-1 sudo[143943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.085 143813 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.085 143813 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.085 143813 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:50:46 compute-1 python3.9[143947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:50:46 compute-1 sudo[143943]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.550 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5bba41-3b30-4518-ad6b-b4b68eef18ba]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.552 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, column=external_ids, values=({'neutron:ovn-metadata-id': 'a75d91b6-054c-5910-b05b-ab1c1fe068f6'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 09:50:46 compute-1 sudo[144070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gezgrdrqxwqdputatypnsrcutssgkisj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025845.621254-1423-185411318382995/AnsiballZ_copy.py'
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.570 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 09:50:46 compute-1 sudo[144070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.583 143542 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.584 143542 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.585 143542 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.586 143542 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.587 143542 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.588 143542 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.589 143542 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.590 143542 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.591 143542 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.592 143542 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.593 143542 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.594 143542 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.595 143542 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.596 143542 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.597 143542 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.598 143542 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.599 143542 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.600 143542 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.601 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.602 143542 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.603 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.604 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.605 143542 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.606 143542 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.607 143542 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.608 143542 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.609 143542 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.610 143542 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.611 143542 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.612 143542 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.613 143542 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.614 143542 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.615 143542 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.616 143542 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.617 143542 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.618 143542 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.619 143542 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.620 143542 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.621 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.622 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.623 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.624 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.625 143542 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.626 143542 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.626 143542 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.627 143542 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:50:46 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:50:46.627 143542 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 02 09:50:46 compute-1 sshd-session[143944]: Invalid user ubuntu from 123.58.212.100 port 33740
Feb 02 09:50:46 compute-1 python3.9[144072]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770025845.621254-1423-185411318382995/.source.yaml _original_basename=.gtyzrjnz follow=False checksum=f6b794fee8fdd156223951721cad4bcef298320f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:50:46 compute-1 sudo[144070]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:46 compute-1 sudo[144074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:50:46 compute-1 sudo[144074]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:50:46 compute-1 sudo[144074]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:46 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:47 compute-1 sshd-session[143944]: Connection closed by invalid user ubuntu 123.58.212.100 port 33740 [preauth]
Feb 02 09:50:47 compute-1 sshd-session[134925]: Connection closed by 192.168.122.30 port 53054
Feb 02 09:50:47 compute-1 sshd-session[134921]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:50:47 compute-1 systemd[1]: session-51.scope: Deactivated successfully.
Feb 02 09:50:47 compute-1 systemd[1]: session-51.scope: Consumed 50.394s CPU time.
Feb 02 09:50:47 compute-1 systemd-logind[805]: Session 51 logged out. Waiting for processes to exit.
Feb 02 09:50:47 compute-1 systemd-logind[805]: Removed session 51.
Feb 02 09:50:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:47 compute-1 ceph-mon[80115]: pgmap v329: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 750 B/s wr, 2 op/s
Feb 02 09:50:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:50:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:50:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:50:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:47.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:47.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:48 compute-1 sshd-session[144124]: Invalid user ubuntu from 123.58.212.100 port 33752
Feb 02 09:50:48 compute-1 sshd-session[144124]: Connection closed by invalid user ubuntu 123.58.212.100 port 33752 [preauth]
Feb 02 09:50:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:48 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:49 compute-1 ceph-mon[80115]: pgmap v330: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 125 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:50:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:49.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:49 compute-1 sshd-session[144126]: Invalid user ubuntu from 123.58.212.100 port 33762
Feb 02 09:50:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:49.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:49 compute-1 sshd-session[144126]: Connection closed by invalid user ubuntu 123.58.212.100 port 33762 [preauth]
Feb 02 09:50:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:50 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:51 compute-1 sshd-session[144129]: Invalid user ubuntu from 123.58.212.100 port 33768
Feb 02 09:50:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:51 compute-1 sshd-session[144129]: Connection closed by invalid user ubuntu 123.58.212.100 port 33768 [preauth]
Feb 02 09:50:51 compute-1 ceph-mon[80115]: pgmap v331: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 204 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:50:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:51.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:51.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:52 compute-1 ceph-mon[80115]: pgmap v332: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:50:52 compute-1 sshd-session[144132]: Invalid user ubuntu from 123.58.212.100 port 50758
Feb 02 09:50:52 compute-1 sshd-session[144132]: Connection closed by invalid user ubuntu 123.58.212.100 port 50758 [preauth]
Feb 02 09:50:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:52 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:53 compute-1 sshd-session[144135]: Accepted publickey for zuul from 192.168.122.30 port 53654 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:50:53 compute-1 systemd-logind[805]: New session 52 of user zuul.
Feb 02 09:50:53 compute-1 systemd[1]: Started Session 52 of User zuul.
Feb 02 09:50:53 compute-1 sshd-session[144135]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:50:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:53.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:53.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:54 compute-1 python3.9[144290]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:50:54 compute-1 sshd-session[144141]: Invalid user debian from 123.58.212.100 port 50766
Feb 02 09:50:54 compute-1 sshd-session[144141]: Connection closed by invalid user debian 123.58.212.100 port 50766 [preauth]
Feb 02 09:50:54 compute-1 sudo[144372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:50:54 compute-1 sudo[144372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:50:54 compute-1 sudo[144372]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:54 compute-1 ceph-mon[80115]: pgmap v333: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:50:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:54 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:55 compute-1 sudo[144472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fybtygcxzkoqccncyjyxxyundvzlnzvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025854.6086228-58-212002717765863/AnsiballZ_command.py'
Feb 02 09:50:55 compute-1 sudo[144472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:55 compute-1 python3.9[144474]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:50:55 compute-1 sudo[144472]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:55 compute-1 sshd-session[144319]: Invalid user debian from 123.58.212.100 port 50778
Feb 02 09:50:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:55.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:55 compute-1 sshd-session[144319]: Connection closed by invalid user debian 123.58.212.100 port 50778 [preauth]
Feb 02 09:50:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:55.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:50:56 compute-1 sudo[144639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biirqzguamoozhgbqwwslexljovoolkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025855.7662237-91-26153872236672/AnsiballZ_systemd_service.py'
Feb 02 09:50:56 compute-1 sudo[144639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:50:56 compute-1 python3.9[144641]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 02 09:50:56 compute-1 systemd[1]: Reloading.
Feb 02 09:50:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:56 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:57 compute-1 systemd-sysv-generator[144671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:50:57 compute-1 systemd-rc-local-generator[144668]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:50:57 compute-1 ceph-mon[80115]: pgmap v334: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:50:57 compute-1 sshd-session[144564]: Invalid user debian from 123.58.212.100 port 50790
Feb 02 09:50:57 compute-1 sudo[144639]: pam_unix(sudo:session): session closed for user root
Feb 02 09:50:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:57 compute-1 sshd-session[144564]: Connection closed by invalid user debian 123.58.212.100 port 50790 [preauth]
Feb 02 09:50:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:50:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:57.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:50:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:57.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:57 compute-1 sshd-session[144675]: Invalid user solv from 80.94.92.184 port 38094
Feb 02 09:50:57 compute-1 sshd-session[144675]: Connection closed by invalid user solv 80.94.92.184 port 38094 [preauth]
Feb 02 09:50:58 compute-1 python3.9[144831]: ansible-ansible.builtin.service_facts Invoked
Feb 02 09:50:58 compute-1 network[144848]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 02 09:50:58 compute-1 network[144849]: 'network-scripts' will be removed from distribution in near future.
Feb 02 09:50:58 compute-1 network[144850]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 02 09:50:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:58 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:59 compute-1 ceph-mon[80115]: pgmap v335: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:50:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:50:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:50:59 compute-1 sshd-session[144756]: Invalid user debian from 123.58.212.100 port 50794
Feb 02 09:50:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:50:59.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:50:59 compute-1 sshd-session[144756]: Connection closed by invalid user debian 123.58.212.100 port 50794 [preauth]
Feb 02 09:50:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:50:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:50:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:50:59.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095100 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:51:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:01 compute-1 ceph-mon[80115]: pgmap v336: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:51:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:01 compute-1 sudo[145114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jybybfcsuobgwlznlkcgfweatpmnmssi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025861.1201706-148-169235426098723/AnsiballZ_systemd_service.py'
Feb 02 09:51:01 compute-1 sudo[145114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:01 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:01.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:01 compute-1 python3.9[145116]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:51:01 compute-1 sudo[145114]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:01.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:01 compute-1 sshd-session[144987]: Invalid user debian from 123.58.212.100 port 50800
Feb 02 09:51:02 compute-1 sshd-session[144987]: Connection closed by invalid user debian 123.58.212.100 port 50800 [preauth]
Feb 02 09:51:02 compute-1 sudo[145267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brmdviybvvjvraxudmaukmibswgxplsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025861.9037764-148-37753009064105/AnsiballZ_systemd_service.py'
Feb 02 09:51:02 compute-1 sudo[145267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:51:02 compute-1 python3.9[145269]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:51:02 compute-1 sudo[145267]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:02 compute-1 sudo[145423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phrvppoadjhsylnomhgjyvhmrjbmsyze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025862.6408315-148-88461281363058/AnsiballZ_systemd_service.py'
Feb 02 09:51:02 compute-1 sudo[145423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:03 compute-1 python3.9[145425]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:51:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:03 compute-1 sudo[145423]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:03 compute-1 ceph-mon[80115]: pgmap v337: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:51:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0003840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:03 compute-1 sshd-session[145270]: Invalid user debian from 123.58.212.100 port 33848
Feb 02 09:51:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:03.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:03 compute-1 sshd-session[145270]: Connection closed by invalid user debian 123.58.212.100 port 33848 [preauth]
Feb 02 09:51:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:03.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:03 compute-1 sudo[145587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufcjvtefrezxfqfjocrvtripufxfltwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025863.4021573-148-235175851725900/AnsiballZ_systemd_service.py'
Feb 02 09:51:03 compute-1 sudo[145587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:03 compute-1 podman[145550]: 2026-02-02 09:51:03.874381129 +0000 UTC m=+0.129411330 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 09:51:04 compute-1 python3.9[145594]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:51:04 compute-1 sudo[145587]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:04 compute-1 sudo[145760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfvtcdlhtbquaxxrmeoakxburedutzhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025864.5830643-148-63469221302879/AnsiballZ_systemd_service.py'
Feb 02 09:51:04 compute-1 sudo[145760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:04 compute-1 sshd-session[145607]: Invalid user debian from 123.58.212.100 port 33860
Feb 02 09:51:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:05 compute-1 sshd-session[145607]: Connection closed by invalid user debian 123.58.212.100 port 33860 [preauth]
Feb 02 09:51:05 compute-1 python3.9[145762]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:51:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:05 compute-1 sudo[145760]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:05 compute-1 ceph-mon[80115]: pgmap v338: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:51:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:05.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:05 compute-1 sudo[145915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roovfofotpwuxbjzzmsrwqkoapwgxvjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025865.489949-148-184734546696855/AnsiballZ_systemd_service.py'
Feb 02 09:51:05 compute-1 sudo[145915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:05.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:05 compute-1 python3.9[145917]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:51:06 compute-1 sudo[145915]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:06 compute-1 sudo[146068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnnphzirgkxyfuthqihitqcgihcvbiaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025866.1195579-148-50853477994815/AnsiballZ_systemd_service.py'
Feb 02 09:51:06 compute-1 sudo[146068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:06 compute-1 python3.9[146070]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:51:06 compute-1 sudo[146068]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:07 compute-1 sshd-session[145769]: Invalid user debian from 123.58.212.100 port 33870
Feb 02 09:51:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:07 compute-1 sshd-session[145769]: Connection closed by invalid user debian 123.58.212.100 port 33870 [preauth]
Feb 02 09:51:07 compute-1 ceph-mon[80115]: pgmap v339: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:51:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:07.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:07 compute-1 sudo[146224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayzthdnkffdzbuvtukvencyhzdhnqqyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025867.2663195-304-65732200982629/AnsiballZ_file.py'
Feb 02 09:51:07 compute-1 sudo[146224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:07.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:07 compute-1 python3.9[146226]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:07 compute-1 sudo[146224]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:08 compute-1 sudo[146376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efkqhzrposfqfxbtrtmdwnbejwglekyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025868.0771153-304-62889996691641/AnsiballZ_file.py'
Feb 02 09:51:08 compute-1 sudo[146376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:08 compute-1 ceph-mon[80115]: pgmap v340: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:51:08 compute-1 python3.9[146378]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:08 compute-1 sudo[146376]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:08 compute-1 sshd-session[146196]: Invalid user debian from 123.58.212.100 port 33872
Feb 02 09:51:08 compute-1 sshd-session[146196]: Connection closed by invalid user debian 123.58.212.100 port 33872 [preauth]
Feb 02 09:51:08 compute-1 sudo[146529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfrlgxvszqbwnysqzzeinzkeapkkxwwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025868.658364-304-145460158679501/AnsiballZ_file.py'
Feb 02 09:51:08 compute-1 sudo[146529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:09 compute-1 python3.9[146531]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:09 compute-1 sudo[146529]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:09 compute-1 sudo[146683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxolklildafrclejjhbvwgnrgahubuax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025869.3450394-304-150454930082179/AnsiballZ_file.py'
Feb 02 09:51:09 compute-1 sudo[146683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:09.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:09.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:09 compute-1 python3.9[146685]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:09 compute-1 sudo[146683]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:10 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:51:10 compute-1 sshd-session[146532]: Invalid user debian from 123.58.212.100 port 33888
Feb 02 09:51:10 compute-1 sudo[146835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbziurrfufautyhkisvmhzhububvtxju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025870.0438938-304-59383617532153/AnsiballZ_file.py'
Feb 02 09:51:10 compute-1 sudo[146835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:10 compute-1 sshd-session[146532]: Connection closed by invalid user debian 123.58.212.100 port 33888 [preauth]
Feb 02 09:51:10 compute-1 python3.9[146837]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:10 compute-1 sudo[146835]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:10 compute-1 ceph-mon[80115]: pgmap v341: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:51:10 compute-1 sudo[146990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojglbcxecjsvykqnvmiidiagrprsobgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025870.5765235-304-73796198452794/AnsiballZ_file.py'
Feb 02 09:51:10 compute-1 sudo[146990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:11 compute-1 python3.9[146992]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:11 compute-1 sudo[146990]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:11 compute-1 sudo[147144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyoimcxiddmsyetxujkvqgeadbbayusw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025871.1846735-304-76028730968703/AnsiballZ_file.py'
Feb 02 09:51:11 compute-1 sudo[147144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:11.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:11 compute-1 sshd-session[146866]: Invalid user debian from 123.58.212.100 port 33900
Feb 02 09:51:11 compute-1 python3.9[147146]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:11 compute-1 sudo[147144]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:11.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:11 compute-1 sshd-session[146866]: Connection closed by invalid user debian 123.58.212.100 port 33900 [preauth]
Feb 02 09:51:12 compute-1 sudo[147297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aosjjdkylxxuqkoppxkkyfdhmqqkrfqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025871.883183-454-8497977195636/AnsiballZ_file.py'
Feb 02 09:51:12 compute-1 sudo[147297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095112 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:51:12 compute-1 python3.9[147300]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:12 compute-1 sudo[147297]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:13 compute-1 sudo[147451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnrygmtxrqmlizlmcsrbjufgxrpqifkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025872.8273587-454-279086742588978/AnsiballZ_file.py'
Feb 02 09:51:13 compute-1 sudo[147451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:13 compute-1 sshd-session[147296]: Invalid user debian from 123.58.212.100 port 55830
Feb 02 09:51:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:51:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:51:13 compute-1 podman[147453]: 2026-02-02 09:51:13.136908298 +0000 UTC m=+0.065718343 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:51:13 compute-1 ceph-mon[80115]: pgmap v342: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:51:13 compute-1 python3.9[147454]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:13 compute-1 sudo[147451]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:13 compute-1 sshd-session[147296]: Connection closed by invalid user debian 123.58.212.100 port 55830 [preauth]
Feb 02 09:51:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:13 compute-1 sudo[147626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slzeoqmqspzdsiuglfsedirzrsnikiia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025873.4016187-454-207649466126764/AnsiballZ_file.py'
Feb 02 09:51:13 compute-1 sudo[147626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:13.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:13 compute-1 python3.9[147628]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:13.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:13 compute-1 sudo[147626]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:14 compute-1 sudo[147778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fujmcbmmgygfiirvymiehpdywwxiyqpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025873.9557052-454-162619450366540/AnsiballZ_file.py'
Feb 02 09:51:14 compute-1 sudo[147778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:14 compute-1 python3.9[147780]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:14 compute-1 sudo[147778]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:14 compute-1 sshd-session[147574]: Invalid user debian from 123.58.212.100 port 55838
Feb 02 09:51:14 compute-1 sshd-session[147574]: Connection closed by invalid user debian 123.58.212.100 port 55838 [preauth]
Feb 02 09:51:14 compute-1 sudo[147902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:51:14 compute-1 sudo[147902]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:51:14 compute-1 sudo[147902]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:14 compute-1 sudo[147956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eckkycmueeihzurojglpobpncpwmdsja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025874.6090503-454-163334229543298/AnsiballZ_file.py'
Feb 02 09:51:14 compute-1 sudo[147956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:15 compute-1 python3.9[147958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8001760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:15 compute-1 sudo[147956]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:15 compute-1 ceph-mon[80115]: pgmap v343: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:51:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:15 compute-1 sudo[148110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbcywsrurkgwvugaumesllamwphqdfze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025875.177045-454-248752427927269/AnsiballZ_file.py'
Feb 02 09:51:15 compute-1 sudo[148110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:15.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:15 compute-1 python3.9[148112]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:15 compute-1 sudo[148110]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:15.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:16 compute-1 sudo[148262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqsonqgphknioputlcxuauddhpjzcbrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025875.8400815-454-169622525730374/AnsiballZ_file.py'
Feb 02 09:51:16 compute-1 sudo[148262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:16 compute-1 python3.9[148264]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:51:16 compute-1 sudo[148262]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:16 compute-1 sshd-session[147959]: Invalid user debian from 123.58.212.100 port 55848
Feb 02 09:51:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:16 compute-1 sshd-session[147959]: Connection closed by invalid user debian 123.58.212.100 port 55848 [preauth]
Feb 02 09:51:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:51:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:51:17 compute-1 sudo[148417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plxphlxalqlbpfqyzwhwvbhrofjsndbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025876.869021-607-156559727113572/AnsiballZ_command.py'
Feb 02 09:51:17 compute-1 sudo[148417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:17 compute-1 ceph-mon[80115]: pgmap v344: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:51:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:51:17 compute-1 python3.9[148419]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:51:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:17 compute-1 sudo[148417]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:17.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:17.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:17 compute-1 sshd-session[148336]: Invalid user debian from 123.58.212.100 port 55852
Feb 02 09:51:18 compute-1 sshd-session[148336]: Connection closed by invalid user debian 123.58.212.100 port 55852 [preauth]
Feb 02 09:51:18 compute-1 python3.9[148571]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 02 09:51:18 compute-1 sudo[148724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrgdnmodckpjdlncobhghwpzlxolpnib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025878.7053642-661-35674447082133/AnsiballZ_systemd_service.py'
Feb 02 09:51:18 compute-1 sudo[148724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:19 compute-1 sshd-session[148596]: Invalid user debian from 123.58.212.100 port 55866
Feb 02 09:51:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:19 compute-1 ceph-mon[80115]: pgmap v345: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:51:19 compute-1 python3.9[148726]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 02 09:51:19 compute-1 systemd[1]: Reloading.
Feb 02 09:51:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:19 compute-1 sshd-session[148596]: Connection closed by invalid user debian 123.58.212.100 port 55866 [preauth]
Feb 02 09:51:19 compute-1 systemd-sysv-generator[148757]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:51:19 compute-1 systemd-rc-local-generator[148753]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:51:19 compute-1 sudo[148724]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:19.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:19.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:20 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:51:20 compute-1 sudo[148913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asljifdzgczwkpcbdfcxtkidrcyjjiao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025879.8709197-685-3706927425517/AnsiballZ_command.py'
Feb 02 09:51:20 compute-1 sudo[148913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:20 compute-1 python3.9[148915]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:51:20 compute-1 sudo[148913]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:20 compute-1 sshd-session[148762]: Invalid user debian from 123.58.212.100 port 55868
Feb 02 09:51:20 compute-1 sudo[149067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtulrexexmhptfcwgfygjfetzbddcgkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025880.534928-685-124154859149921/AnsiballZ_command.py'
Feb 02 09:51:20 compute-1 sudo[149067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:20 compute-1 sshd-session[148762]: Connection closed by invalid user debian 123.58.212.100 port 55868 [preauth]
Feb 02 09:51:20 compute-1 python3.9[149069]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:51:21 compute-1 sudo[149067]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:21 compute-1 ceph-mon[80115]: pgmap v346: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Feb 02 09:51:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:21 compute-1 sudo[149222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rulqqbrjwpxbilvvmgvumsbqncnmlslb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025881.139769-685-169338073212749/AnsiballZ_command.py'
Feb 02 09:51:21 compute-1 sudo[149222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:21 compute-1 python3.9[149224]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:51:21 compute-1 sudo[149222]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:21.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:21.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:21 compute-1 sshd-session[149073]: Invalid user debian from 123.58.212.100 port 55874
Feb 02 09:51:22 compute-1 sudo[149375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hymtycbxfckbngfoedyviomerpmfoylo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025881.7466516-685-272453453252345/AnsiballZ_command.py'
Feb 02 09:51:22 compute-1 sudo[149375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:22 compute-1 sshd-session[149073]: Connection closed by invalid user debian 123.58.212.100 port 55874 [preauth]
Feb 02 09:51:22 compute-1 python3.9[149377]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:51:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:51:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:23 compute-1 sudo[149375]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:23 compute-1 ceph-mon[80115]: pgmap v347: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 682 B/s wr, 2 op/s
Feb 02 09:51:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:23 compute-1 sshd-session[149379]: Invalid user debian from 123.58.212.100 port 52164
Feb 02 09:51:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:23.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:23 compute-1 sudo[149531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alchrqjkrxnqqdoktsressonmbybiaiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025883.422226-685-242150517248699/AnsiballZ_command.py'
Feb 02 09:51:23 compute-1 sudo[149531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:23 compute-1 sshd-session[149379]: Connection closed by invalid user debian 123.58.212.100 port 52164 [preauth]
Feb 02 09:51:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:23.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:23 compute-1 python3.9[149533]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:51:23 compute-1 sudo[149531]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:24 compute-1 sudo[149686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohyyhscikjsxbjshpvcfmdustzdnrmtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025884.028213-685-267012652998288/AnsiballZ_command.py'
Feb 02 09:51:24 compute-1 sudo[149686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:24 compute-1 python3.9[149688]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:51:24 compute-1 sudo[149686]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:24 compute-1 sshd-session[149558]: Invalid user debian from 123.58.212.100 port 52176
Feb 02 09:51:24 compute-1 sudo[149840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xstijjumhzneidzpvthvwbxwtxxlkxol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025884.6944716-685-220384405893513/AnsiballZ_command.py'
Feb 02 09:51:24 compute-1 sudo[149840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:25 compute-1 sshd-session[149558]: Connection closed by invalid user debian 123.58.212.100 port 52176 [preauth]
Feb 02 09:51:25 compute-1 python3.9[149842]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:51:25 compute-1 sudo[149840]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:25 compute-1 ceph-mon[80115]: pgmap v348: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:51:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:25.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:25.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:51:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:26 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:51:26 compute-1 sudo[149995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-towqyrkhlkguxpqdnzwxofhkungnyrze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025885.8288746-847-243965198342694/AnsiballZ_getent.py'
Feb 02 09:51:26 compute-1 sudo[149995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:26 compute-1 sshd-session[149868]: Invalid user debian from 123.58.212.100 port 52188
Feb 02 09:51:26 compute-1 python3.9[149997]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 02 09:51:26 compute-1 sudo[149995]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:26 compute-1 sshd-session[149868]: Connection closed by invalid user debian 123.58.212.100 port 52188 [preauth]
Feb 02 09:51:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095126 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:51:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:27 compute-1 sudo[150151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwqikhfmvoastgtklwkhzcqibngkboie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025886.6701326-871-140408096204819/AnsiballZ_group.py'
Feb 02 09:51:27 compute-1 sudo[150151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:27 compute-1 python3.9[150153]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 02 09:51:27 compute-1 ceph-mon[80115]: pgmap v349: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Feb 02 09:51:27 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:51:27 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:51:27 compute-1 groupadd[150154]: group added to /etc/group: name=libvirt, GID=42473
Feb 02 09:51:27 compute-1 groupadd[150154]: group added to /etc/gshadow: name=libvirt
Feb 02 09:51:27 compute-1 groupadd[150154]: new group: name=libvirt, GID=42473
Feb 02 09:51:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:27 compute-1 sudo[150151]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:27.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:27 compute-1 sshd-session[150076]: Invalid user debian from 123.58.212.100 port 52192
Feb 02 09:51:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:27.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:27 compute-1 sshd-session[150076]: Connection closed by invalid user debian 123.58.212.100 port 52192 [preauth]
Feb 02 09:51:28 compute-1 sudo[150312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuddgloivqtitnyxrxgwxcdzuaevigli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025887.9347892-895-122656762770837/AnsiballZ_user.py'
Feb 02 09:51:28 compute-1 sudo[150312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:28 compute-1 python3.9[150314]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 02 09:51:28 compute-1 useradd[150316]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Feb 02 09:51:28 compute-1 sudo[150312]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:29 compute-1 sshd-session[150237]: Invalid user debian from 123.58.212.100 port 52196
Feb 02 09:51:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:51:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:29 compute-1 sshd-session[150237]: Connection closed by invalid user debian 123.58.212.100 port 52196 [preauth]
Feb 02 09:51:29 compute-1 ceph-mon[80115]: pgmap v350: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Feb 02 09:51:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:29 compute-1 sudo[150473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycdefioxxkrrofrbbetyfzrpawmhngfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025889.169418-928-267250676885319/AnsiballZ_setup.py'
Feb 02 09:51:29 compute-1 sudo[150473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:29.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:29 compute-1 python3.9[150475]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:51:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:29.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:30 compute-1 sudo[150473]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:30 compute-1 sudo[150559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvveawvzjxfqwhhwlrrzsficvgjcqdvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025889.169418-928-267250676885319/AnsiballZ_dnf.py'
Feb 02 09:51:30 compute-1 sudo[150559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:51:30 compute-1 sshd-session[150476]: Invalid user debian from 123.58.212.100 port 52210
Feb 02 09:51:30 compute-1 python3.9[150561]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:51:30 compute-1 sshd-session[150476]: Connection closed by invalid user debian 123.58.212.100 port 52210 [preauth]
Feb 02 09:51:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:31 compute-1 ceph-mon[80115]: pgmap v351: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:51:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:31.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:31.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095132 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:51:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:51:32 compute-1 sshd-session[150564]: Invalid user debian from 123.58.212.100 port 52214
Feb 02 09:51:32 compute-1 sshd-session[150564]: Connection closed by invalid user debian 123.58.212.100 port 52214 [preauth]
Feb 02 09:51:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:33 compute-1 ceph-mon[80115]: pgmap v352: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:51:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:33.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:33.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:33 compute-1 sshd-session[150575]: Invalid user debian from 123.58.212.100 port 45332
Feb 02 09:51:34 compute-1 sshd-session[150575]: Connection closed by invalid user debian 123.58.212.100 port 45332 [preauth]
Feb 02 09:51:34 compute-1 podman[150579]: 2026-02-02 09:51:34.462390325 +0000 UTC m=+0.120311581 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 02 09:51:34 compute-1 sudo[150608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:51:34 compute-1 sudo[150608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:51:34 compute-1 sudo[150608]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:35 compute-1 sshd-session[150580]: Invalid user debian from 123.58.212.100 port 45340
Feb 02 09:51:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:35 compute-1 ceph-mon[80115]: pgmap v353: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:51:35 compute-1 sshd-session[150580]: Connection closed by invalid user debian 123.58.212.100 port 45340 [preauth]
Feb 02 09:51:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:35.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:35.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:36 compute-1 sshd-session[150653]: Invalid user debian from 123.58.212.100 port 45350
Feb 02 09:51:37 compute-1 sshd-session[150653]: Connection closed by invalid user debian 123.58.212.100 port 45350 [preauth]
Feb 02 09:51:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003380 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:37 compute-1 ceph-mon[80115]: pgmap v354: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 682 B/s wr, 2 op/s
Feb 02 09:51:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:37.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:37.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:38 compute-1 ceph-mon[80115]: pgmap v355: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 682 B/s wr, 2 op/s
Feb 02 09:51:39 compute-1 sshd-session[150707]: Invalid user debian from 123.58.212.100 port 45362
Feb 02 09:51:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:39 compute-1 sshd-session[150707]: Connection closed by invalid user debian 123.58.212.100 port 45362 [preauth]
Feb 02 09:51:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:39.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:39.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:40 compute-1 sshd-session[150782]: Invalid user debian from 123.58.212.100 port 45366
Feb 02 09:51:40 compute-1 sshd-session[150782]: Connection closed by invalid user debian 123.58.212.100 port 45366 [preauth]
Feb 02 09:51:40 compute-1 ceph-mon[80115]: pgmap v356: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 682 B/s wr, 2 op/s
Feb 02 09:51:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:41.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:41.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:42 compute-1 sshd-session[150814]: Invalid user debian from 123.58.212.100 port 45372
Feb 02 09:51:42 compute-1 sshd-session[150814]: Connection closed by invalid user debian 123.58.212.100 port 45372 [preauth]
Feb 02 09:51:42 compute-1 ceph-mon[80115]: pgmap v357: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:51:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:43 compute-1 podman[150821]: 2026-02-02 09:51:43.39316664 +0000 UTC m=+0.064096281 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 02 09:51:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:43 compute-1 sshd-session[150818]: Invalid user debian from 123.58.212.100 port 52082
Feb 02 09:51:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:43.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:43.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:43 compute-1 sshd-session[150818]: Connection closed by invalid user debian 123.58.212.100 port 52082 [preauth]
Feb 02 09:51:44 compute-1 ceph-mon[80115]: pgmap v358: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:51:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:51:44.889 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:51:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:51:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:51:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:51:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:51:45 compute-1 sshd-session[150843]: Invalid user debian from 123.58.212.100 port 52098
Feb 02 09:51:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:45 compute-1 sshd-session[150843]: Connection closed by invalid user debian 123.58.212.100 port 52098 [preauth]
Feb 02 09:51:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:45.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:45.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:46 compute-1 sshd-session[150849]: Invalid user debian from 123.58.212.100 port 52114
Feb 02 09:51:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:46 compute-1 sshd-session[150849]: Connection closed by invalid user debian 123.58.212.100 port 52114 [preauth]
Feb 02 09:51:46 compute-1 ceph-mon[80115]: pgmap v359: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:51:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:47 compute-1 sudo[150855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:51:47 compute-1 sudo[150855]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:51:47 compute-1 sudo[150855]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:47 compute-1 sudo[150880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:51:47 compute-1 sudo[150880]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:51:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:47 compute-1 sudo[150880]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:47.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:47 compute-1 sshd-session[150853]: Invalid user debian from 123.58.212.100 port 52134
Feb 02 09:51:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:47.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:51:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:51:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:51:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:51:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:51:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:51:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:51:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:51:47 compute-1 sshd-session[150853]: Connection closed by invalid user debian 123.58.212.100 port 52134 [preauth]
Feb 02 09:51:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095148 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:51:48 compute-1 ceph-mon[80115]: pgmap v360: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:51:48 compute-1 ceph-mon[80115]: pgmap v361: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 op/s
Feb 02 09:51:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:49 compute-1 sshd-session[150937]: Invalid user debian from 123.58.212.100 port 52144
Feb 02 09:51:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:49 compute-1 sshd-session[150937]: Connection closed by invalid user debian 123.58.212.100 port 52144 [preauth]
Feb 02 09:51:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:49.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:49.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:50 compute-1 sshd-session[150940]: Invalid user debian from 123.58.212.100 port 52160
Feb 02 09:51:50 compute-1 ceph-mon[80115]: pgmap v362: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 op/s
Feb 02 09:51:51 compute-1 sshd-session[150940]: Connection closed by invalid user debian 123.58.212.100 port 52160 [preauth]
Feb 02 09:51:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:51.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:51.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:52 compute-1 sshd-session[150943]: Invalid user debian from 123.58.212.100 port 58528
Feb 02 09:51:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095152 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:51:52 compute-1 sshd-session[150943]: Connection closed by invalid user debian 123.58.212.100 port 58528 [preauth]
Feb 02 09:51:52 compute-1 sudo[150946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:51:52 compute-1 sudo[150946]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:51:52 compute-1 sudo[150946]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:52 compute-1 ceph-mon[80115]: pgmap v363: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 305 B/s rd, 0 op/s
Feb 02 09:51:52 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:51:52 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:51:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:53 compute-1 kernel: SELinux:  Converting 2782 SID table entries...
Feb 02 09:51:53 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 02 09:51:53 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 02 09:51:53 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 02 09:51:53 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 02 09:51:53 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 02 09:51:53 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 02 09:51:53 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 02 09:51:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:53.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:53 compute-1 sshd-session[150959]: Invalid user debian from 123.58.212.100 port 58532
Feb 02 09:51:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:53.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:54 compute-1 sshd-session[150959]: Connection closed by invalid user debian 123.58.212.100 port 58532 [preauth]
Feb 02 09:51:54 compute-1 sudo[150984]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:51:54 compute-1 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Feb 02 09:51:54 compute-1 sudo[150984]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:51:54 compute-1 sudo[150984]: pam_unix(sudo:session): session closed for user root
Feb 02 09:51:55 compute-1 ceph-mon[80115]: pgmap v364: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 101 B/s rd, 0 op/s
Feb 02 09:51:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:55 compute-1 sshd-session[150980]: Invalid user debian from 123.58.212.100 port 58542
Feb 02 09:51:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:55 compute-1 sshd-session[150980]: Connection closed by invalid user debian 123.58.212.100 port 58542 [preauth]
Feb 02 09:51:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:55.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:55.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:51:56 compute-1 sshd-session[151009]: Invalid user debian from 123.58.212.100 port 58558
Feb 02 09:51:56 compute-1 sshd-session[151009]: Connection closed by invalid user debian 123.58.212.100 port 58558 [preauth]
Feb 02 09:51:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:57 compute-1 ceph-mon[80115]: pgmap v365: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 101 B/s rd, 0 op/s
Feb 02 09:51:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:51:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:57.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:51:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:51:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:57.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:51:58 compute-1 sshd-session[151012]: Invalid user debian from 123.58.212.100 port 58570
Feb 02 09:51:58 compute-1 sshd-session[151012]: Connection closed by invalid user debian 123.58.212.100 port 58570 [preauth]
Feb 02 09:51:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:59 compute-1 ceph-mon[80115]: pgmap v366: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 101 B/s rd, 0 op/s
Feb 02 09:51:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:59 compute-1 sshd-session[151014]: Invalid user debian from 123.58.212.100 port 58584
Feb 02 09:51:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:51:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:51:59 compute-1 sshd-session[151014]: Connection closed by invalid user debian 123.58.212.100 port 58584 [preauth]
Feb 02 09:51:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:51:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:51:59.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:51:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:51:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:51:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:51:59.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:52:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:00 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:52:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:01 compute-1 ceph-mon[80115]: pgmap v367: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Feb 02 09:52:01 compute-1 sshd-session[151017]: Invalid user debian from 123.58.212.100 port 58586
Feb 02 09:52:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:01 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:01 compute-1 sshd-session[151017]: Connection closed by invalid user debian 123.58.212.100 port 58586 [preauth]
Feb 02 09:52:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:52:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:01.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:52:01 compute-1 kernel: SELinux:  Converting 2782 SID table entries...
Feb 02 09:52:01 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 02 09:52:01 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 02 09:52:01 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 02 09:52:01 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 02 09:52:01 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 02 09:52:01 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 02 09:52:01 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 02 09:52:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:01.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:52:02 compute-1 sshd-session[151026]: Invalid user debian from 123.58.212.100 port 50970
Feb 02 09:52:02 compute-1 sshd-session[151026]: Connection closed by invalid user debian 123.58.212.100 port 50970 [preauth]
Feb 02 09:52:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:03 compute-1 ceph-mon[80115]: pgmap v368: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Feb 02 09:52:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:52:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:03.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:03.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:04 compute-1 sshd-session[151030]: Invalid user debian from 123.58.212.100 port 50974
Feb 02 09:52:04 compute-1 sshd-session[151030]: Connection closed by invalid user debian 123.58.212.100 port 50974 [preauth]
Feb 02 09:52:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:05 compute-1 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 02 09:52:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:05 compute-1 ceph-mon[80115]: pgmap v369: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:52:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:05 compute-1 podman[151035]: 2026-02-02 09:52:05.422182048 +0000 UTC m=+0.087817959 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:52:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:05.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:05.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:06 compute-1 sshd-session[151033]: Invalid user debian from 123.58.212.100 port 50990
Feb 02 09:52:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:06 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:52:06 compute-1 sshd-session[151033]: Connection closed by invalid user debian 123.58.212.100 port 50990 [preauth]
Feb 02 09:52:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:07 compute-1 ceph-mon[80115]: pgmap v370: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:52:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:07.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:07 compute-1 sshd-session[151063]: Invalid user debian from 123.58.212.100 port 50994
Feb 02 09:52:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:07.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:08 compute-1 sshd-session[151063]: Connection closed by invalid user debian 123.58.212.100 port 50994 [preauth]
Feb 02 09:52:08 compute-1 ceph-mon[80115]: pgmap v371: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:52:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095208 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:52:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:09 compute-1 sshd-session[151065]: Invalid user debian from 123.58.212.100 port 50996
Feb 02 09:52:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:09 compute-1 sshd-session[151065]: Connection closed by invalid user debian 123.58.212.100 port 50996 [preauth]
Feb 02 09:52:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:52:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:52:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:09.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:09.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:10 compute-1 sshd-session[151068]: Invalid user debian from 123.58.212.100 port 51006
Feb 02 09:52:10 compute-1 ceph-mon[80115]: pgmap v372: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.5 KiB/s wr, 4 op/s
Feb 02 09:52:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:11 compute-1 sshd-session[151068]: Connection closed by invalid user debian 123.58.212.100 port 51006 [preauth]
Feb 02 09:52:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:11.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:11.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:12 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:52:12 compute-1 sshd-session[151071]: Invalid user debian from 123.58.212.100 port 42014
Feb 02 09:52:12 compute-1 sshd-session[151071]: Connection closed by invalid user debian 123.58.212.100 port 42014 [preauth]
Feb 02 09:52:13 compute-1 ceph-mon[80115]: pgmap v373: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Feb 02 09:52:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:52:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:13.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:52:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:13.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:14 compute-1 podman[152706]: 2026-02-02 09:52:14.374048612 +0000 UTC m=+0.049265997 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 02 09:52:15 compute-1 sudo[153176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:52:15 compute-1 sudo[153176]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:52:15 compute-1 sudo[153176]: pam_unix(sudo:session): session closed for user root
Feb 02 09:52:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:15 compute-1 sshd-session[152637]: Invalid user debian from 123.58.212.100 port 42024
Feb 02 09:52:15 compute-1 ceph-mon[80115]: pgmap v374: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Feb 02 09:52:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:15 compute-1 sshd-session[152637]: Connection closed by invalid user debian 123.58.212.100 port 42024 [preauth]
Feb 02 09:52:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:15.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:15.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095216 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:52:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:16 compute-1 sshd-session[153615]: Invalid user debian from 123.58.212.100 port 42032
Feb 02 09:52:16 compute-1 sshd-session[153615]: Connection closed by invalid user debian 123.58.212.100 port 42032 [preauth]
Feb 02 09:52:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:17 compute-1 ceph-mon[80115]: pgmap v375: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 852 B/s wr, 3 op/s
Feb 02 09:52:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:17.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:52:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:17.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:52:17 compute-1 sshd-session[154613]: Invalid user debian from 123.58.212.100 port 42046
Feb 02 09:52:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:52:18 compute-1 sshd-session[154613]: Connection closed by invalid user debian 123.58.212.100 port 42046 [preauth]
Feb 02 09:52:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:19 compute-1 ceph-mon[80115]: pgmap v376: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 852 B/s wr, 3 op/s
Feb 02 09:52:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb 02 09:52:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:19.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb 02 09:52:19 compute-1 sshd-session[155720]: Invalid user debian from 123.58.212.100 port 42062
Feb 02 09:52:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:19.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:20 compute-1 sshd-session[155720]: Connection closed by invalid user debian 123.58.212.100 port 42062 [preauth]
Feb 02 09:52:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:21 compute-1 ceph-mon[80115]: pgmap v377: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 852 B/s wr, 3 op/s
Feb 02 09:52:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:21.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:21 compute-1 sshd-session[156838]: Invalid user debian from 123.58.212.100 port 42072
Feb 02 09:52:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:21.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:22 compute-1 sshd-session[156838]: Connection closed by invalid user debian 123.58.212.100 port 42072 [preauth]
Feb 02 09:52:22 compute-1 ceph-mon[80115]: pgmap v378: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Feb 02 09:52:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:23 compute-1 sshd-session[158409]: Invalid user debian from 123.58.212.100 port 36314
Feb 02 09:52:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:23 compute-1 sshd-session[158409]: Connection closed by invalid user debian 123.58.212.100 port 36314 [preauth]
Feb 02 09:52:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:23.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:23.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:24 compute-1 sshd-session[159551]: Invalid user debian from 123.58.212.100 port 36328
Feb 02 09:52:24 compute-1 ceph-mon[80115]: pgmap v379: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 255 B/s wr, 1 op/s
Feb 02 09:52:25 compute-1 sshd-session[159551]: Connection closed by invalid user debian 123.58.212.100 port 36328 [preauth]
Feb 02 09:52:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:25.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:25.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:26 compute-1 sshd-session[160856]: Invalid user debian from 123.58.212.100 port 36336
Feb 02 09:52:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:26 compute-1 sshd-session[160856]: Connection closed by invalid user debian 123.58.212.100 port 36336 [preauth]
Feb 02 09:52:26 compute-1 ceph-mon[80115]: pgmap v380: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:52:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095226 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:52:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:27 compute-1 sshd-session[161848]: Invalid user debian from 123.58.212.100 port 36338
Feb 02 09:52:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:27.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:27.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:28 compute-1 sshd-session[161848]: Connection closed by invalid user debian 123.58.212.100 port 36338 [preauth]
Feb 02 09:52:28 compute-1 ceph-mon[80115]: pgmap v381: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:52:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:29 compute-1 sshd-session[163487]: Invalid user debian from 123.58.212.100 port 36346
Feb 02 09:52:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:29.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:29 compute-1 sshd-session[163487]: Connection closed by invalid user debian 123.58.212.100 port 36346 [preauth]
Feb 02 09:52:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:29.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:30 compute-1 ceph-mon[80115]: pgmap v382: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:52:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:31 compute-1 sshd-session[164448]: Invalid user debian from 123.58.212.100 port 36356
Feb 02 09:52:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc004920 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:31 compute-1 sshd-session[164448]: Connection closed by invalid user debian 123.58.212.100 port 36356 [preauth]
Feb 02 09:52:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:31.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:31.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:32 compute-1 sshd-session[165664]: Invalid user debian from 123.58.212.100 port 41584
Feb 02 09:52:32 compute-1 sshd-session[165664]: Connection closed by invalid user debian 123.58.212.100 port 41584 [preauth]
Feb 02 09:52:32 compute-1 ceph-mon[80115]: pgmap v383: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:52:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:52:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:33 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:33.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:33.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:33 compute-1 sshd-session[166735]: Invalid user debian from 123.58.212.100 port 41590
Feb 02 09:52:34 compute-1 sshd-session[166735]: Connection closed by invalid user debian 123.58.212.100 port 41590 [preauth]
Feb 02 09:52:35 compute-1 ceph-mon[80115]: pgmap v384: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:52:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:35 compute-1 sudo[168011]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:52:35 compute-1 sudo[168011]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:52:35 compute-1 sudo[168011]: pam_unix(sudo:session): session closed for user root
Feb 02 09:52:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:35 compute-1 sshd-session[167703]: Invalid user debian from 123.58.212.100 port 41592
Feb 02 09:52:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:35 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:35 compute-1 sshd-session[167703]: Connection closed by invalid user debian 123.58.212.100 port 41592 [preauth]
Feb 02 09:52:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:52:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:35.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:52:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:35.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:36 compute-1 podman[168041]: 2026-02-02 09:52:36.42145437 +0000 UTC m=+0.088953977 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 02 09:52:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:36 compute-1 sshd-session[168038]: Invalid user debian from 123.58.212.100 port 41606
Feb 02 09:52:36 compute-1 sshd-session[168038]: Connection closed by invalid user debian 123.58.212.100 port 41606 [preauth]
Feb 02 09:52:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:36 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:52:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:37 compute-1 ceph-mon[80115]: pgmap v385: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:52:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:37 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:37.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:37.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:38 compute-1 sshd-session[168084]: Invalid user debian from 123.58.212.100 port 41612
Feb 02 09:52:38 compute-1 sshd-session[168084]: Connection closed by invalid user debian 123.58.212.100 port 41612 [preauth]
Feb 02 09:52:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:39 compute-1 ceph-mon[80115]: pgmap v386: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:52:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:39 compute-1 sshd-session[168090]: Invalid user debian from 123.58.212.100 port 41614
Feb 02 09:52:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:39.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:52:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:39.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:52:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:52:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:39 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:52:40 compute-1 sshd-session[168090]: Connection closed by invalid user debian 123.58.212.100 port 41614 [preauth]
Feb 02 09:52:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:41 compute-1 sshd-session[168092]: Invalid user debian from 123.58.212.100 port 41622
Feb 02 09:52:41 compute-1 ceph-mon[80115]: pgmap v387: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:52:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:41 compute-1 sshd-session[168092]: Connection closed by invalid user debian 123.58.212.100 port 41622 [preauth]
Feb 02 09:52:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:41 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:52:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:41.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:52:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:41.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:42 compute-1 sshd-session[168095]: Invalid user debian from 123.58.212.100 port 37228
Feb 02 09:52:42 compute-1 sshd-session[168095]: Connection closed by invalid user debian 123.58.212.100 port 37228 [preauth]
Feb 02 09:52:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:52:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:43 compute-1 ceph-mon[80115]: pgmap v388: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:52:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:43 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:52:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:43.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:52:43 compute-1 sshd-session[168098]: Invalid user debian from 123.58.212.100 port 37242
Feb 02 09:52:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:52:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:43.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:52:44 compute-1 sshd-session[168098]: Connection closed by invalid user debian 123.58.212.100 port 37242 [preauth]
Feb 02 09:52:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:52:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:52:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:52:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:52:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:52:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:52:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:45 compute-1 sshd-session[168100]: Invalid user debian from 123.58.212.100 port 37256
Feb 02 09:52:45 compute-1 podman[168103]: 2026-02-02 09:52:45.392893239 +0000 UTC m=+0.068672977 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 02 09:52:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:45 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:45 compute-1 ceph-mon[80115]: pgmap v389: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:52:45 compute-1 sshd-session[168100]: Connection closed by invalid user debian 123.58.212.100 port 37256 [preauth]
Feb 02 09:52:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:45.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:52:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:45.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:52:46 compute-1 kernel: SELinux:  Converting 2783 SID table entries...
Feb 02 09:52:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:46 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 02 09:52:46 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 02 09:52:46 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 02 09:52:46 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 02 09:52:46 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 02 09:52:46 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 02 09:52:46 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 02 09:52:46 compute-1 ceph-mon[80115]: pgmap v390: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:52:46 compute-1 sshd-session[168124]: Invalid user debian from 123.58.212.100 port 37262
Feb 02 09:52:47 compute-1 sshd-session[168124]: Connection closed by invalid user debian 123.58.212.100 port 37262 [preauth]
Feb 02 09:52:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:47 compute-1 groupadd[168138]: group added to /etc/group: name=dnsmasq, GID=993
Feb 02 09:52:47 compute-1 groupadd[168138]: group added to /etc/gshadow: name=dnsmasq
Feb 02 09:52:47 compute-1 groupadd[168138]: new group: name=dnsmasq, GID=993
Feb 02 09:52:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:47 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:47 compute-1 useradd[168145]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 02 09:52:47 compute-1 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb 02 09:52:47 compute-1 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 02 09:52:47 compute-1 dbus-broker-launch[775]: Noticed file-system modification, trigger reload.
Feb 02 09:52:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:52:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:47.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:52:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:47.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:52:48 compute-1 sshd-session[168134]: Invalid user debian from 123.58.212.100 port 37270
Feb 02 09:52:48 compute-1 groupadd[168158]: group added to /etc/group: name=clevis, GID=992
Feb 02 09:52:48 compute-1 groupadd[168158]: group added to /etc/gshadow: name=clevis
Feb 02 09:52:48 compute-1 groupadd[168158]: new group: name=clevis, GID=992
Feb 02 09:52:48 compute-1 sshd-session[168134]: Connection closed by invalid user debian 123.58.212.100 port 37270 [preauth]
Feb 02 09:52:48 compute-1 useradd[168165]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 02 09:52:48 compute-1 usermod[168175]: add 'clevis' to group 'tss'
Feb 02 09:52:48 compute-1 usermod[168175]: add 'clevis' to shadow group 'tss'
Feb 02 09:52:48 compute-1 ceph-mon[80115]: pgmap v391: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:52:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095248 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:52:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:49 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:49 compute-1 sshd-session[168185]: Invalid user debian from 123.58.212.100 port 37278
Feb 02 09:52:49 compute-1 sshd-session[168185]: Connection closed by invalid user debian 123.58.212.100 port 37278 [preauth]
Feb 02 09:52:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:49.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:49.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:50 compute-1 polkitd[43542]: Reloading rules
Feb 02 09:52:50 compute-1 polkitd[43542]: Collecting garbage unconditionally...
Feb 02 09:52:50 compute-1 polkitd[43542]: Loading rules from directory /etc/polkit-1/rules.d
Feb 02 09:52:50 compute-1 polkitd[43542]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 02 09:52:50 compute-1 polkitd[43542]: Finished loading, compiling and executing 3 rules
Feb 02 09:52:50 compute-1 polkitd[43542]: Reloading rules
Feb 02 09:52:50 compute-1 polkitd[43542]: Collecting garbage unconditionally...
Feb 02 09:52:50 compute-1 polkitd[43542]: Loading rules from directory /etc/polkit-1/rules.d
Feb 02 09:52:50 compute-1 polkitd[43542]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 02 09:52:50 compute-1 polkitd[43542]: Finished loading, compiling and executing 3 rules
Feb 02 09:52:50 compute-1 sshd-session[168202]: Invalid user debian from 123.58.212.100 port 37284
Feb 02 09:52:50 compute-1 ceph-mon[80115]: pgmap v392: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:52:51 compute-1 sshd-session[168202]: Connection closed by invalid user debian 123.58.212.100 port 37284 [preauth]
Feb 02 09:52:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:51 compute-1 groupadd[168373]: group added to /etc/group: name=ceph, GID=167
Feb 02 09:52:51 compute-1 groupadd[168373]: group added to /etc/gshadow: name=ceph
Feb 02 09:52:51 compute-1 groupadd[168373]: new group: name=ceph, GID=167
Feb 02 09:52:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:51 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:51 compute-1 useradd[168379]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 02 09:52:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:51.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:51.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:52 compute-1 sshd-session[168370]: Invalid user debian from 123.58.212.100 port 37294
Feb 02 09:52:52 compute-1 sshd-session[168370]: Connection closed by invalid user debian 123.58.212.100 port 37294 [preauth]
Feb 02 09:52:52 compute-1 sudo[168387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:52:52 compute-1 sudo[168387]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:52:52 compute-1 sudo[168387]: pam_unix(sudo:session): session closed for user root
Feb 02 09:52:52 compute-1 sudo[168414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Feb 02 09:52:52 compute-1 sudo[168414]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:52:52 compute-1 ceph-mon[80115]: pgmap v393: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:52:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:53 compute-1 podman[168624]: 2026-02-02 09:52:53.40935939 +0000 UTC m=+0.072620152 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Feb 02 09:52:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:53 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:53 compute-1 podman[168624]: 2026-02-02 09:52:53.553130254 +0000 UTC m=+0.216391006 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 02 09:52:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:53.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:53.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:53 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb 02 09:52:53 compute-1 sshd-session[168410]: Invalid user debian from 123.58.212.100 port 50910
Feb 02 09:52:54 compute-1 podman[169252]: 2026-02-02 09:52:54.142302234 +0000 UTC m=+0.071739789 container exec 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 09:52:54 compute-1 podman[169252]: 2026-02-02 09:52:54.154710334 +0000 UTC m=+0.084147829 container exec_died 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 09:52:54 compute-1 sshd-session[168410]: Connection closed by invalid user debian 123.58.212.100 port 50910 [preauth]
Feb 02 09:52:54 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Feb 02 09:52:54 compute-1 sshd[1010]: Received signal 15; terminating.
Feb 02 09:52:54 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Feb 02 09:52:54 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Feb 02 09:52:54 compute-1 systemd[1]: sshd.service: Consumed 13.475s CPU time, read 32.0K from disk, written 32.0K to disk.
Feb 02 09:52:54 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Feb 02 09:52:54 compute-1 systemd[1]: Stopping sshd-keygen.target...
Feb 02 09:52:54 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 02 09:52:54 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 02 09:52:54 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 02 09:52:54 compute-1 systemd[1]: Reached target sshd-keygen.target.
Feb 02 09:52:54 compute-1 systemd[1]: Starting OpenSSH server daemon...
Feb 02 09:52:54 compute-1 sshd[169352]: Server listening on 0.0.0.0 port 22.
Feb 02 09:52:54 compute-1 sshd[169352]: Server listening on :: port 22.
Feb 02 09:52:54 compute-1 systemd[1]: Started OpenSSH server daemon.
Feb 02 09:52:54 compute-1 podman[169353]: 2026-02-02 09:52:54.427198932 +0000 UTC m=+0.072293424 container exec 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Feb 02 09:52:54 compute-1 podman[169353]: 2026-02-02 09:52:54.45158612 +0000 UTC m=+0.096680612 container exec_died 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:52:54 compute-1 podman[169446]: 2026-02-02 09:52:54.706535001 +0000 UTC m=+0.067593569 container exec 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 09:52:54 compute-1 podman[169446]: 2026-02-02 09:52:54.73957335 +0000 UTC m=+0.100631848 container exec_died 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 09:52:54 compute-1 ceph-mon[80115]: pgmap v394: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:52:54 compute-1 podman[169544]: 2026-02-02 09:52:54.965759986 +0000 UTC m=+0.054217863 container exec 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, description=keepalived for Ceph, vcs-type=git)
Feb 02 09:52:55 compute-1 podman[169544]: 2026-02-02 09:52:55.0226825 +0000 UTC m=+0.111140327 container exec_died 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, distribution-scope=public, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, release=1793, io.buildah.version=1.28.2)
Feb 02 09:52:55 compute-1 sudo[168414]: pam_unix(sudo:session): session closed for user root
Feb 02 09:52:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da8002f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:55 compute-1 sudo[169612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:52:55 compute-1 sudo[169612]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:52:55 compute-1 sudo[169612]: pam_unix(sudo:session): session closed for user root
Feb 02 09:52:55 compute-1 sudo[169645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:52:55 compute-1 sudo[169645]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:52:55 compute-1 sudo[169673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:52:55 compute-1 sudo[169673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:52:55 compute-1 sudo[169673]: pam_unix(sudo:session): session closed for user root
Feb 02 09:52:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:55 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:55 compute-1 sudo[169645]: pam_unix(sudo:session): session closed for user root
Feb 02 09:52:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:55.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:55.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:56 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 02 09:52:56 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:52:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:52:56 compute-1 systemd[1]: Reloading.
Feb 02 09:52:56 compute-1 systemd-rc-local-generator[169870]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:52:56 compute-1 systemd-sysv-generator[169875]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:52:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:52:56 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 02 09:52:56 compute-1 sshd-session[169531]: Invalid user debian from 123.58.212.100 port 50934
Feb 02 09:52:56 compute-1 sshd-session[169531]: Connection closed by invalid user debian 123.58.212.100 port 50934 [preauth]
Feb 02 09:52:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:57 compute-1 ceph-mon[80115]: pgmap v395: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:52:57 compute-1 ceph-mon[80115]: pgmap v396: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 299 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:52:57 compute-1 ceph-mon[80115]: Health check update: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb 02 09:52:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:57 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db8002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:57.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:57.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:58 compute-1 sshd-session[171317]: Invalid user debian from 123.58.212.100 port 50946
Feb 02 09:52:58 compute-1 sshd-session[171317]: Connection closed by invalid user debian 123.58.212.100 port 50946 [preauth]
Feb 02 09:52:58 compute-1 sudo[150559]: pam_unix(sudo:session): session closed for user root
Feb 02 09:52:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:59 compute-1 ceph-mon[80115]: pgmap v397: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 299 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:52:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:52:59 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:52:59 compute-1 sshd-session[173553]: Invalid user debian from 123.58.212.100 port 50958
Feb 02 09:52:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:52:59.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:52:59 compute-1 sshd-session[173553]: Connection closed by invalid user debian 123.58.212.100 port 50958 [preauth]
Feb 02 09:52:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:52:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:52:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:52:59.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:01 compute-1 sshd-session[175373]: Invalid user admin from 123.58.212.100 port 50966
Feb 02 09:53:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:01 compute-1 sudo[176590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:53:01 compute-1 sudo[176590]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:53:01 compute-1 sudo[176590]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:01 compute-1 sshd-session[175373]: Connection closed by invalid user admin 123.58.212.100 port 50966 [preauth]
Feb 02 09:53:01 compute-1 ceph-mon[80115]: pgmap v398: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 299 B/s rd, 0 op/s
Feb 02 09:53:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:53:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:53:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:01 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:01 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:01.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:01.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:53:02 compute-1 sshd-session[176975]: Invalid user admin from 123.58.212.100 port 33322
Feb 02 09:53:02 compute-1 sshd-session[176975]: Connection closed by invalid user admin 123.58.212.100 port 33322 [preauth]
Feb 02 09:53:03 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 02 09:53:03 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 02 09:53:03 compute-1 systemd[1]: man-db-cache-update.service: Consumed 8.869s CPU time.
Feb 02 09:53:03 compute-1 systemd[1]: run-rb09085d442244dd7bc625d76dab9b4fc.service: Deactivated successfully.
Feb 02 09:53:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:03 compute-1 ceph-mon[80115]: pgmap v399: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 299 B/s rd, 0 op/s
Feb 02 09:53:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:03 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:53:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:03.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:53:03 compute-1 sshd-session[178318]: Invalid user admin from 123.58.212.100 port 33326
Feb 02 09:53:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:03.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:04 compute-1 sshd-session[178318]: Connection closed by invalid user admin 123.58.212.100 port 33326 [preauth]
Feb 02 09:53:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:05 compute-1 sshd-session[178321]: Invalid user admin from 123.58.212.100 port 33342
Feb 02 09:53:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:05 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:05 compute-1 sshd-session[178321]: Connection closed by invalid user admin 123.58.212.100 port 33342 [preauth]
Feb 02 09:53:05 compute-1 auditd[706]: Audit daemon rotating log files
Feb 02 09:53:05 compute-1 ceph-mon[80115]: pgmap v400: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 299 B/s rd, 0 op/s
Feb 02 09:53:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:05.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:05.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:06 compute-1 ceph-mon[80115]: pgmap v401: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 299 B/s rd, 0 op/s
Feb 02 09:53:06 compute-1 sshd-session[178324]: Invalid user admin from 123.58.212.100 port 33344
Feb 02 09:53:06 compute-1 podman[178326]: 2026-02-02 09:53:06.835590033 +0000 UTC m=+0.151707296 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 02 09:53:06 compute-1 sshd-session[178324]: Connection closed by invalid user admin 123.58.212.100 port 33344 [preauth]
Feb 02 09:53:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dd800a8a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:07 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:07.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:07.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:08 compute-1 sudo[178480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzyhgyqnbemvhjuxlpyqwztikengydfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025987.5838099-965-70919650007735/AnsiballZ_systemd.py'
Feb 02 09:53:08 compute-1 sudo[178480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:08 compute-1 python3.9[178482]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 02 09:53:08 compute-1 systemd[1]: Reloading.
Feb 02 09:53:08 compute-1 systemd-sysv-generator[178513]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:53:08 compute-1 systemd-rc-local-generator[178510]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:53:08 compute-1 sshd-session[178353]: Invalid user admin from 123.58.212.100 port 33356
Feb 02 09:53:08 compute-1 sudo[178480]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:09 compute-1 sshd-session[178353]: Connection closed by invalid user admin 123.58.212.100 port 33356 [preauth]
Feb 02 09:53:09 compute-1 ceph-mon[80115]: pgmap v402: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:53:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db80042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:09 compute-1 sudo[178674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjcpbfgyroprndooybqzrprwwbpogtyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025989.0878599-965-245595165065549/AnsiballZ_systemd.py'
Feb 02 09:53:09 compute-1 sudo[178674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:09 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:09 compute-1 python3.9[178676]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 02 09:53:09 compute-1 systemd[1]: Reloading.
Feb 02 09:53:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:09.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:09 compute-1 systemd-sysv-generator[178705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:53:09 compute-1 systemd-rc-local-generator[178698]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:53:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:09.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:10 compute-1 sudo[178674]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:10 compute-1 sshd-session[178651]: Invalid user admin from 123.58.212.100 port 33366
Feb 02 09:53:10 compute-1 sshd-session[178651]: Connection closed by invalid user admin 123.58.212.100 port 33366 [preauth]
Feb 02 09:53:10 compute-1 sudo[178864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeqdiysuvegspjvhyjmxujgfwmzdjulm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025990.241632-965-10068861072451/AnsiballZ_systemd.py'
Feb 02 09:53:10 compute-1 sudo[178864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:10 compute-1 python3.9[178866]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 02 09:53:10 compute-1 systemd[1]: Reloading.
Feb 02 09:53:10 compute-1 systemd-rc-local-generator[178893]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:53:10 compute-1 systemd-sysv-generator[178901]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:53:11 compute-1 sudo[178864]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:11 compute-1 ceph-mon[80115]: pgmap v403: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:53:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:11 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:11 compute-1 sudo[179058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lymrfmmqgmjrpzmelcosbkmiwolxhczb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025991.2318244-965-52681371998849/AnsiballZ_systemd.py'
Feb 02 09:53:11 compute-1 sudo[179058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:11 compute-1 sshd-session[178867]: Invalid user admin from 123.58.212.100 port 33370
Feb 02 09:53:11 compute-1 python3.9[179060]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 02 09:53:11 compute-1 systemd[1]: Reloading.
Feb 02 09:53:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:11.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:11 compute-1 systemd-rc-local-generator[179088]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:53:11 compute-1 sshd-session[178867]: Connection closed by invalid user admin 123.58.212.100 port 33370 [preauth]
Feb 02 09:53:11 compute-1 systemd-sysv-generator[179092]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:53:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:11.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:12 compute-1 sudo[179058]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:12 compute-1 sudo[179250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiqogfdyyuoblsodrnczckkqnfhyyljf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025992.406033-1051-175458382362315/AnsiballZ_systemd.py'
Feb 02 09:53:12 compute-1 sudo[179250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:13 compute-1 python3.9[179252]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:13 compute-1 sshd-session[179110]: Invalid user admin from 123.58.212.100 port 59306
Feb 02 09:53:13 compute-1 systemd[1]: Reloading.
Feb 02 09:53:13 compute-1 ceph-mon[80115]: pgmap v404: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:53:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:13 compute-1 systemd-sysv-generator[179282]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:53:13 compute-1 systemd-rc-local-generator[179278]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:53:13 compute-1 sshd-session[179110]: Connection closed by invalid user admin 123.58.212.100 port 59306 [preauth]
Feb 02 09:53:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8002230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:13 compute-1 sudo[179250]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:13 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:53:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:13.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:53:13 compute-1 sudo[179442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkyxoabssyjixqlqjdbqtmerynjbkyfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025993.5473459-1051-219534420121826/AnsiballZ_systemd.py'
Feb 02 09:53:13 compute-1 sudo[179442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:53:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:13.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:53:14 compute-1 python3.9[179444]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:14 compute-1 systemd[1]: Reloading.
Feb 02 09:53:14 compute-1 systemd-sysv-generator[179476]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:53:14 compute-1 systemd-rc-local-generator[179472]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:53:14 compute-1 sshd-session[179338]: Invalid user admin from 123.58.212.100 port 59322
Feb 02 09:53:14 compute-1 sudo[179442]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:14 compute-1 sshd-session[179338]: Connection closed by invalid user admin 123.58.212.100 port 59322 [preauth]
Feb 02 09:53:14 compute-1 sudo[179634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgqnyzszgozljbfofmpyuammrmzonpdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025994.6642287-1051-101844018890808/AnsiballZ_systemd.py'
Feb 02 09:53:14 compute-1 sudo[179634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:15 compute-1 ceph-mon[80115]: pgmap v405: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:53:15 compute-1 python3.9[179637]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:15 compute-1 systemd[1]: Reloading.
Feb 02 09:53:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:15 compute-1 systemd-rc-local-generator[179690]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:53:15 compute-1 systemd-sysv-generator[179694]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:53:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:15 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:15 compute-1 sudo[179642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:53:15 compute-1 sudo[179642]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:53:15 compute-1 sudo[179642]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:15 compute-1 sudo[179634]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:15 compute-1 podman[179700]: 2026-02-02 09:53:15.70952055 +0000 UTC m=+0.075979991 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 02 09:53:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:53:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:15.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:53:15 compute-1 sshd-session[179633]: Invalid user admin from 123.58.212.100 port 59338
Feb 02 09:53:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:15.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:16 compute-1 sudo[179868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfqyyabbkcdxbrqwwkuvnpowdqmpgspz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025995.8037326-1051-154417427978235/AnsiballZ_systemd.py'
Feb 02 09:53:16 compute-1 sudo[179868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:16 compute-1 sshd-session[179633]: Connection closed by invalid user admin 123.58.212.100 port 59338 [preauth]
Feb 02 09:53:16 compute-1 python3.9[179870]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:16 compute-1 sudo[179868]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:16 compute-1 sudo[180026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kazvynshgzypotymdmpowjuuqcyrqase ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025996.6347978-1051-255969078709835/AnsiballZ_systemd.py'
Feb 02 09:53:16 compute-1 sudo[180026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:17 compute-1 ceph-mon[80115]: pgmap v406: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:53:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:53:17 compute-1 python3.9[180028]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:17 compute-1 systemd[1]: Reloading.
Feb 02 09:53:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:17 compute-1 systemd-rc-local-generator[180049]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:53:17 compute-1 systemd-sysv-generator[180056]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:53:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:17 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:17 compute-1 sshd-session[179871]: Invalid user admin from 123.58.212.100 port 59348
Feb 02 09:53:17 compute-1 sudo[180026]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:17 compute-1 sshd-session[179871]: Connection closed by invalid user admin 123.58.212.100 port 59348 [preauth]
Feb 02 09:53:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:17.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:17.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:18 compute-1 sudo[180218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-corylwalevpftzakfueicmzgeirsktiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770025998.644611-1159-179759418751597/AnsiballZ_systemd.py'
Feb 02 09:53:18 compute-1 sudo[180218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:18 compute-1 sshd-session[180090]: Invalid user admin from 123.58.212.100 port 59354
Feb 02 09:53:19 compute-1 sshd-session[180090]: Connection closed by invalid user admin 123.58.212.100 port 59354 [preauth]
Feb 02 09:53:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:19 compute-1 python3.9[180220]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 02 09:53:19 compute-1 ceph-mon[80115]: pgmap v407: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:53:19 compute-1 systemd[1]: Reloading.
Feb 02 09:53:19 compute-1 systemd-rc-local-generator[180247]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:53:19 compute-1 systemd-sysv-generator[180252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:53:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:19 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:19 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Feb 02 09:53:19 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb 02 09:53:19 compute-1 sudo[180218]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:19.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:19.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:20 compute-1 sshd-session[180259]: Invalid user admin from 123.58.212.100 port 59368
Feb 02 09:53:20 compute-1 sshd-session[180259]: Connection closed by invalid user admin 123.58.212.100 port 59368 [preauth]
Feb 02 09:53:21 compute-1 sudo[180416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxxifipsuzwkmjloaxhygoejsvrpaxmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026000.8271816-1183-264606013837002/AnsiballZ_systemd.py'
Feb 02 09:53:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4db0002530 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:21 compute-1 sudo[180416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:21 compute-1 ceph-mon[80115]: pgmap v408: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:53:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:21 compute-1 python3.9[180418]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:21 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:21 compute-1 sudo[180416]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:53:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:21.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:53:21 compute-1 sshd-session[180358]: Invalid user admin from 123.58.212.100 port 59376
Feb 02 09:53:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:21.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:22 compute-1 sudo[180571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxbemqjoegpetabgwufxonnwzwitcttb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026001.690447-1183-217686667883593/AnsiballZ_systemd.py'
Feb 02 09:53:22 compute-1 sudo[180571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:22 compute-1 sshd-session[180358]: Connection closed by invalid user admin 123.58.212.100 port 59376 [preauth]
Feb 02 09:53:22 compute-1 python3.9[180573]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:22 compute-1 sudo[180571]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:22 compute-1 sudo[180729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xolhllgvqldolumdqczvwddzrgibzpht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026002.5312557-1183-250631619770123/AnsiballZ_systemd.py'
Feb 02 09:53:22 compute-1 sudo[180729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:23 compute-1 python3.9[180731]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:23 compute-1 ceph-mon[80115]: pgmap v409: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:53:23 compute-1 sudo[180729]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:23 compute-1 sshd-session[180577]: Invalid user admin from 123.58.212.100 port 60810
Feb 02 09:53:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:23 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:23 compute-1 sshd-session[180577]: Connection closed by invalid user admin 123.58.212.100 port 60810 [preauth]
Feb 02 09:53:23 compute-1 sudo[180885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhfhlqiojpmglahrpbrgmyycjmlpjqnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026003.365385-1183-130458687898424/AnsiballZ_systemd.py'
Feb 02 09:53:23 compute-1 sudo[180885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:53:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:23.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:53:23 compute-1 python3.9[180887]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:23.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:24 compute-1 sudo[180885]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:24 compute-1 sudo[181042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htkzsxgrqsdgjtxrtkljywyxnerrjonm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026004.1790779-1183-193211853558429/AnsiballZ_systemd.py'
Feb 02 09:53:24 compute-1 sudo[181042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:24 compute-1 python3.9[181044]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:24 compute-1 sshd-session[180888]: Invalid user admin from 123.58.212.100 port 60816
Feb 02 09:53:24 compute-1 sudo[181042]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:25 compute-1 sshd-session[180888]: Connection closed by invalid user admin 123.58.212.100 port 60816 [preauth]
Feb 02 09:53:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:25 compute-1 sudo[181198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roevnznsmvnalkkbslnwiljrbfaipteh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026004.9914832-1183-73540456476954/AnsiballZ_systemd.py'
Feb 02 09:53:25 compute-1 sudo[181198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:25 compute-1 ceph-mon[80115]: pgmap v410: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:53:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:25 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:25 compute-1 python3.9[181200]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:25 compute-1 sudo[181198]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:25.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:25.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:26 compute-1 sudo[181355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psnxfkzmhxoeyqmdkplzgkjcyxvtlhtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026005.7924838-1183-249280716346905/AnsiballZ_systemd.py'
Feb 02 09:53:26 compute-1 sudo[181355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:26 compute-1 sshd-session[181201]: Invalid user admin from 123.58.212.100 port 60824
Feb 02 09:53:26 compute-1 python3.9[181357]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:26 compute-1 sudo[181355]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:26 compute-1 sshd-session[181201]: Connection closed by invalid user admin 123.58.212.100 port 60824 [preauth]
Feb 02 09:53:26 compute-1 sudo[181513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uowapmcilqxjnwrpsxqnqhpnstltlbxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026006.5962496-1183-234541517582785/AnsiballZ_systemd.py'
Feb 02 09:53:26 compute-1 sudo[181513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:27 compute-1 python3.9[181515]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:27 compute-1 sudo[181513]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:27 compute-1 ceph-mon[80115]: pgmap v411: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:53:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:27 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:27 compute-1 sshd-session[181461]: Invalid user admin from 123.58.212.100 port 60830
Feb 02 09:53:27 compute-1 sudo[181668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbzgbspwqpfuupedfuayqduknoxourgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026007.3902044-1183-55184418492257/AnsiballZ_systemd.py'
Feb 02 09:53:27 compute-1 sudo[181668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:27.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:27 compute-1 sshd-session[181461]: Connection closed by invalid user admin 123.58.212.100 port 60830 [preauth]
Feb 02 09:53:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:27.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:27 compute-1 python3.9[181670]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:28 compute-1 sudo[181668]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:28 compute-1 sudo[181825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unedajhcmrppcwmyvlbpbnjdrzrejppo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026008.2169707-1183-9602974128346/AnsiballZ_systemd.py'
Feb 02 09:53:28 compute-1 sudo[181825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:28 compute-1 python3.9[181827]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:28 compute-1 sudo[181825]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:29 compute-1 ceph-mon[80115]: pgmap v412: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:53:29 compute-1 sudo[181981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fevhfvwkoscobqaqeileejgfqfyfakfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026009.0293853-1183-259828198252098/AnsiballZ_systemd.py'
Feb 02 09:53:29 compute-1 sudo[181981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:29 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dc8003b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:29 compute-1 sshd-session[181695]: Invalid user admin from 123.58.212.100 port 60846
Feb 02 09:53:29 compute-1 python3.9[181983]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:29 compute-1 sudo[181981]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:29 compute-1 sshd-session[181695]: Connection closed by invalid user admin 123.58.212.100 port 60846 [preauth]
Feb 02 09:53:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:53:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:29.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:53:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:29.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:30 compute-1 sudo[182138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inbleuddqzrcizmbiunigdrfvjadlmkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026009.8259096-1183-87223783400922/AnsiballZ_systemd.py'
Feb 02 09:53:30 compute-1 sudo[182138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:30 compute-1 python3.9[182140]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:30 compute-1 sudo[182138]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:30 compute-1 sudo[182294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fomcqqrplkidrkaltguojjhpcuqegoqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026010.6057396-1183-85401864867877/AnsiballZ_systemd.py'
Feb 02 09:53:30 compute-1 sudo[182294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:30 compute-1 sshd-session[182109]: Invalid user admin from 123.58.212.100 port 60854
Feb 02 09:53:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dcc00c8f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:31 compute-1 python3.9[182296]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:31 compute-1 sshd-session[182109]: Connection closed by invalid user admin 123.58.212.100 port 60854 [preauth]
Feb 02 09:53:31 compute-1 sudo[182294]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:31 compute-1 ceph-mon[80115]: pgmap v413: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:53:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4dbc002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[122309]: 02/02/2026 09:53:31 : epoch 698072dc : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4da80045d0 fd 38 proxy ignored for local
Feb 02 09:53:31 compute-1 kernel: ganesha.nfsd[151860]: segfault at 50 ip 00007f4e616b332e sp 00007f4dd77fd210 error 4 in libntirpc.so.5.8[7f4e61698000+2c000] likely on CPU 7 (core 0, socket 7)
Feb 02 09:53:31 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:53:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:31 compute-1 systemd[1]: Started Process Core Dump (PID 182377/UID 0).
Feb 02 09:53:31 compute-1 sudo[182453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euyryynslmnaxnskpuodqjpmbnzpenju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026011.4065988-1183-48550723192455/AnsiballZ_systemd.py'
Feb 02 09:53:31 compute-1 sudo[182453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:53:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:31.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:53:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:31.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:31 compute-1 python3.9[182455]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 02 09:53:32 compute-1 sudo[182453]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:53:32 compute-1 sshd-session[182376]: Invalid user admin from 123.58.212.100 port 44078
Feb 02 09:53:32 compute-1 systemd-coredump[182383]: Process 122314 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 74:
                                                    #0  0x00007f4e616b332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 09:53:32 compute-1 systemd[1]: systemd-coredump@5-182377-0.service: Deactivated successfully.
Feb 02 09:53:32 compute-1 podman[182487]: 2026-02-02 09:53:32.57527812 +0000 UTC m=+0.042176191 container died 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 09:53:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-8524dd3387d19fcf089347383faa64c8e2290cf613dceca79426c8e374e209c0-merged.mount: Deactivated successfully.
Feb 02 09:53:32 compute-1 sshd-session[182376]: Connection closed by invalid user admin 123.58.212.100 port 44078 [preauth]
Feb 02 09:53:32 compute-1 podman[182487]: 2026-02-02 09:53:32.620558139 +0000 UTC m=+0.087456220 container remove 7d1344a9b85ae5cf187282ef94fd744fe626fdac05803bf052cfc41639d346b5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Feb 02 09:53:32 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:53:32 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:53:32 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.755s CPU time.
Feb 02 09:53:33 compute-1 sudo[182659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgfxmkmtpbqfegvrfbbsxmepycmzuwid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026012.9219134-1489-45201152039489/AnsiballZ_file.py'
Feb 02 09:53:33 compute-1 sudo[182659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:33 compute-1 ceph-mon[80115]: pgmap v414: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:53:33 compute-1 python3.9[182661]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:53:33 compute-1 sudo[182659]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:33 compute-1 sshd-session[182532]: Invalid user admin from 123.58.212.100 port 44092
Feb 02 09:53:33 compute-1 sudo[182811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgekjamnmyzzgwkypaybbmhgmqymqvom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026013.595209-1489-86911829936921/AnsiballZ_file.py'
Feb 02 09:53:33 compute-1 sudo[182811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:53:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:33.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:53:33 compute-1 sshd-session[182532]: Connection closed by invalid user admin 123.58.212.100 port 44092 [preauth]
Feb 02 09:53:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:33.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:34 compute-1 python3.9[182813]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:53:34 compute-1 sudo[182811]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:34 compute-1 sudo[182965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfmdnlkwhfyvefuniuvgejxfwekezrfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026014.2042093-1489-7514764113140/AnsiballZ_file.py'
Feb 02 09:53:34 compute-1 sudo[182965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:34 compute-1 python3.9[182967]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:53:34 compute-1 sudo[182965]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:34 compute-1 sudo[183118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmhjrcebfvxbilplkeggrfvdytcieliw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026014.7665992-1489-120715154820312/AnsiballZ_file.py'
Feb 02 09:53:34 compute-1 sudo[183118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:35 compute-1 sshd-session[182855]: Invalid user admin from 123.58.212.100 port 44102
Feb 02 09:53:35 compute-1 python3.9[183120]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:53:35 compute-1 sudo[183118]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:35 compute-1 sshd-session[182855]: Connection closed by invalid user admin 123.58.212.100 port 44102 [preauth]
Feb 02 09:53:35 compute-1 ceph-mon[80115]: pgmap v415: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:53:35 compute-1 sudo[183270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvwybwrjxzfggpmsmehpkycxyufqyjgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026015.2945526-1489-75495875470016/AnsiballZ_file.py'
Feb 02 09:53:35 compute-1 sudo[183270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:35 compute-1 python3.9[183273]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:53:35 compute-1 sudo[183275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:53:35 compute-1 sudo[183275]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:53:35 compute-1 sudo[183275]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:35 compute-1 sudo[183270]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:53:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:35.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:53:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:35.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:36 compute-1 sudo[183449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvuryphufvzkzulrxiwkvyunscuetggh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026015.945957-1489-162423933546938/AnsiballZ_file.py'
Feb 02 09:53:36 compute-1 sudo[183449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:36 compute-1 python3.9[183451]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:53:36 compute-1 sudo[183449]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:36 compute-1 sshd-session[183272]: Invalid user admin from 123.58.212.100 port 44114
Feb 02 09:53:36 compute-1 sshd-session[183272]: Connection closed by invalid user admin 123.58.212.100 port 44114 [preauth]
Feb 02 09:53:37 compute-1 podman[183578]: 2026-02-02 09:53:37.173288406 +0000 UTC m=+0.110173206 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 02 09:53:37 compute-1 python3.9[183613]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:53:37 compute-1 ceph-mon[80115]: pgmap v416: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:53:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095337 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:53:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:37.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:37.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:38 compute-1 sudo[183780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iunaftafnwlznbktrudpealpairmnmkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026017.6431115-1642-205544022524164/AnsiballZ_stat.py'
Feb 02 09:53:38 compute-1 sudo[183780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:38 compute-1 sshd-session[183552]: Invalid user admin from 123.58.212.100 port 44116
Feb 02 09:53:38 compute-1 python3.9[183782]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:53:38 compute-1 sudo[183780]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:38 compute-1 sshd-session[183552]: Connection closed by invalid user admin 123.58.212.100 port 44116 [preauth]
Feb 02 09:53:38 compute-1 sudo[183908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeydpgzmjisbmpwchjumioonwmnhoynz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026017.6431115-1642-205544022524164/AnsiballZ_copy.py'
Feb 02 09:53:38 compute-1 sudo[183908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:39 compute-1 python3.9[183910]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026017.6431115-1642-205544022524164/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:39 compute-1 sudo[183908]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095339 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:53:39 compute-1 ceph-mon[80115]: pgmap v417: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:53:39 compute-1 sudo[184060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kksktmxqgrdqxczepxginxkfaewvhvzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026019.2232573-1642-70406926936368/AnsiballZ_stat.py'
Feb 02 09:53:39 compute-1 sudo[184060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:39 compute-1 sshd-session[183856]: Invalid user admin from 123.58.212.100 port 44120
Feb 02 09:53:39 compute-1 python3.9[184062]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:53:39 compute-1 sudo[184060]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:39 compute-1 sshd-session[183856]: Connection closed by invalid user admin 123.58.212.100 port 44120 [preauth]
Feb 02 09:53:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:39.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:39.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:40 compute-1 sudo[184187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdyxslqyjidfatisivyxdzkarzhsavie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026019.2232573-1642-70406926936368/AnsiballZ_copy.py'
Feb 02 09:53:40 compute-1 sudo[184187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:40 compute-1 python3.9[184189]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026019.2232573-1642-70406926936368/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:40 compute-1 sudo[184187]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:40 compute-1 sudo[184340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwpbmsgdkwffilphyqsqjzwmoijieayl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026020.4817348-1642-249894984223714/AnsiballZ_stat.py'
Feb 02 09:53:40 compute-1 sudo[184340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:40 compute-1 python3.9[184342]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:53:41 compute-1 sudo[184340]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:41 compute-1 sshd-session[184174]: Invalid user admin from 123.58.212.100 port 44136
Feb 02 09:53:41 compute-1 sshd-session[184174]: Connection closed by invalid user admin 123.58.212.100 port 44136 [preauth]
Feb 02 09:53:41 compute-1 sudo[184465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcwouraqbrrbbtyuxwpkpnfasuesueiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026020.4817348-1642-249894984223714/AnsiballZ_copy.py'
Feb 02 09:53:41 compute-1 sudo[184465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:41 compute-1 ceph-mon[80115]: pgmap v418: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:53:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:41 compute-1 python3.9[184467]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026020.4817348-1642-249894984223714/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:41 compute-1 sudo[184465]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:41 compute-1 sudo[184619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqacltgwrjyykbebqfqcdhwovbxbqdpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026021.6501741-1642-99336805560282/AnsiballZ_stat.py'
Feb 02 09:53:41 compute-1 sudo[184619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:41.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:41.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:42 compute-1 python3.9[184621]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:53:42 compute-1 sudo[184619]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:42 compute-1 sshd-session[184468]: Invalid user admin from 123.58.212.100 port 52184
Feb 02 09:53:42 compute-1 sudo[184744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxpkravsfwnlgkoynkqvxqwkussgqusg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026021.6501741-1642-99336805560282/AnsiballZ_copy.py'
Feb 02 09:53:42 compute-1 sudo[184744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:42 compute-1 python3.9[184746]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026021.6501741-1642-99336805560282/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:42 compute-1 sudo[184744]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:42 compute-1 sshd-session[184468]: Connection closed by invalid user admin 123.58.212.100 port 52184 [preauth]
Feb 02 09:53:42 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 6.
Feb 02 09:53:42 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:53:42 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.755s CPU time.
Feb 02 09:53:42 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:53:43 compute-1 podman[184897]: 2026-02-02 09:53:43.059357919 +0000 UTC m=+0.054285489 container create 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Feb 02 09:53:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:53:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:53:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:53:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:53:43 compute-1 podman[184897]: 2026-02-02 09:53:43.038489549 +0000 UTC m=+0.033417149 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:53:43 compute-1 podman[184897]: 2026-02-02 09:53:43.134069474 +0000 UTC m=+0.128997084 container init 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:53:43 compute-1 podman[184897]: 2026-02-02 09:53:43.14217935 +0000 UTC m=+0.137106930 container start 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Feb 02 09:53:43 compute-1 bash[184897]: 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7
Feb 02 09:53:43 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:53:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:53:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:53:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:53:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:53:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:53:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:53:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:53:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:53:43 compute-1 sudo[185004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kggqfcbyihjhjtfvwjrzphfczmynurng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026022.7632465-1642-51387862359910/AnsiballZ_stat.py'
Feb 02 09:53:43 compute-1 sudo[185004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:43 compute-1 ceph-mon[80115]: pgmap v419: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Feb 02 09:53:43 compute-1 python3.9[185007]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:53:43 compute-1 sudo[185004]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:43 compute-1 sshd-session[184844]: Invalid user admin from 123.58.212.100 port 52200
Feb 02 09:53:43 compute-1 sudo[185130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cezpiyxdbanymuizgbsgpvkkmcgafapn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026022.7632465-1642-51387862359910/AnsiballZ_copy.py'
Feb 02 09:53:43 compute-1 sudo[185130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:53:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:43.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:53:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:43.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:44 compute-1 python3.9[185132]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026022.7632465-1642-51387862359910/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:44 compute-1 sshd-session[184844]: Connection closed by invalid user admin 123.58.212.100 port 52200 [preauth]
Feb 02 09:53:44 compute-1 sudo[185130]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:44 compute-1 sudo[185284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wozpcmjotczadegwjjgosqceejmubekh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026024.2034993-1642-18020966382232/AnsiballZ_stat.py'
Feb 02 09:53:44 compute-1 sudo[185284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:44 compute-1 python3.9[185286]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:53:44 compute-1 sudo[185284]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:53:44.891 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:53:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:53:44.892 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:53:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:53:44.893 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:53:45 compute-1 sudo[185410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kodidwxyyzwaybkiktniurrnoboxesax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026024.2034993-1642-18020966382232/AnsiballZ_copy.py'
Feb 02 09:53:45 compute-1 sudo[185410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:45 compute-1 python3.9[185412]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026024.2034993-1642-18020966382232/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:45 compute-1 sudo[185410]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:45 compute-1 ceph-mon[80115]: pgmap v420: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:53:45 compute-1 sshd-session[185209]: Invalid user admin from 123.58.212.100 port 52206
Feb 02 09:53:45 compute-1 sudo[185562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsemubtbqnjdqihbrkamdhtcmpwhoqyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026025.3671057-1642-42392861843968/AnsiballZ_stat.py'
Feb 02 09:53:45 compute-1 sudo[185562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:45 compute-1 sshd-session[185209]: Connection closed by invalid user admin 123.58.212.100 port 52206 [preauth]
Feb 02 09:53:45 compute-1 python3.9[185564]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:53:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:53:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:45.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:53:45 compute-1 sudo[185562]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:45.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:46 compute-1 podman[185637]: 2026-02-02 09:53:46.387336425 +0000 UTC m=+0.064521028 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 02 09:53:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:46 compute-1 sudo[185706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlngbmqxdawotnfsspsguqfxdetbgszp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026025.3671057-1642-42392861843968/AnsiballZ_copy.py'
Feb 02 09:53:46 compute-1 sudo[185706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:46 compute-1 python3.9[185708]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026025.3671057-1642-42392861843968/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:46 compute-1 sudo[185706]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:46 compute-1 sshd-session[185565]: Invalid user admin from 123.58.212.100 port 52218
Feb 02 09:53:47 compute-1 sshd-session[185565]: Connection closed by invalid user admin 123.58.212.100 port 52218 [preauth]
Feb 02 09:53:47 compute-1 sudo[185859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgizamnxbqqpnqbblelkmmvraoxkazdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026026.9117117-1642-36548457365212/AnsiballZ_stat.py'
Feb 02 09:53:47 compute-1 sudo[185859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:47 compute-1 python3.9[185861]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:53:47 compute-1 sudo[185859]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:47 compute-1 ceph-mon[80115]: pgmap v421: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Feb 02 09:53:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:53:47 compute-1 sudo[185986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fheccdjwdislrniustrkysvrrfifislm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026026.9117117-1642-36548457365212/AnsiballZ_copy.py'
Feb 02 09:53:47 compute-1 sudo[185986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:53:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:47.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:53:47 compute-1 python3.9[185988]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1770026026.9117117-1642-36548457365212/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:47 compute-1 sudo[185986]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:47.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:48 compute-1 sshd-session[185862]: Invalid user admin from 123.58.212.100 port 52230
Feb 02 09:53:48 compute-1 sshd-session[185862]: Connection closed by invalid user admin 123.58.212.100 port 52230 [preauth]
Feb 02 09:53:48 compute-1 sudo[186141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olmwnuwnfynpnlyywrljooblwiegclkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026028.719985-1981-10450664947305/AnsiballZ_command.py'
Feb 02 09:53:48 compute-1 sudo[186141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:49 compute-1 python3.9[186143]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb 02 09:53:49 compute-1 sudo[186141]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:53:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:53:49 compute-1 ceph-mon[80115]: pgmap v422: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Feb 02 09:53:49 compute-1 sshd-session[186013]: Invalid user admin from 123.58.212.100 port 52232
Feb 02 09:53:49 compute-1 sudo[186294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfycolovmopspuogmgucipjwnmjuapnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026029.4364285-2008-223658266863120/AnsiballZ_file.py'
Feb 02 09:53:49 compute-1 sudo[186294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:49 compute-1 sshd-session[186013]: Connection closed by invalid user admin 123.58.212.100 port 52232 [preauth]
Feb 02 09:53:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:53:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:49.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:53:49 compute-1 python3.9[186296]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:49 compute-1 sudo[186294]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:49.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:50 compute-1 sudo[186448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnqpkmpxiordsrxjrievvyssawmvmdof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026030.029096-2008-95969627489277/AnsiballZ_file.py'
Feb 02 09:53:50 compute-1 sudo[186448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:50 compute-1 python3.9[186450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:50 compute-1 ceph-mon[80115]: pgmap v423: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 2 op/s
Feb 02 09:53:50 compute-1 sudo[186448]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:50 compute-1 sshd-session[186344]: Invalid user admin from 123.58.212.100 port 52244
Feb 02 09:53:51 compute-1 sshd-session[186344]: Connection closed by invalid user admin 123.58.212.100 port 52244 [preauth]
Feb 02 09:53:51 compute-1 sudo[186601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bypnfbmtryhirnsteuhvavnnidfoxypg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026030.8700438-2008-266192420803361/AnsiballZ_file.py'
Feb 02 09:53:51 compute-1 sudo[186601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:51 compute-1 python3.9[186603]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:51 compute-1 sudo[186601]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:51 compute-1 sudo[186755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyhquiphhwtbhejjbynmppcxzoynpkla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026031.4992743-2008-166255484832873/AnsiballZ_file.py'
Feb 02 09:53:51 compute-1 sudo[186755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:53:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:51.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:53:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:51.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:51 compute-1 python3.9[186757]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:52 compute-1 sudo[186755]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:52 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Feb 02 09:53:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:52 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:53:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:52 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:53:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:52 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:53:52 compute-1 sshd-session[186604]: Invalid user admin from 123.58.212.100 port 54160
Feb 02 09:53:52 compute-1 sudo[186907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaalspoehxmarzaqhjorhqdcpbfrjyrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026032.1897933-2008-175538387154237/AnsiballZ_file.py'
Feb 02 09:53:52 compute-1 sudo[186907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:52 compute-1 python3.9[186909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:52 compute-1 sudo[186907]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:52 compute-1 sshd-session[186604]: Connection closed by invalid user admin 123.58.212.100 port 54160 [preauth]
Feb 02 09:53:53 compute-1 sudo[187062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbtmuncqbqbqkwbfqftnjotsnpifmwuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026032.7601721-2008-107777566445761/AnsiballZ_file.py'
Feb 02 09:53:53 compute-1 sudo[187062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:53 compute-1 ceph-mon[80115]: pgmap v424: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 2 op/s
Feb 02 09:53:53 compute-1 python3.9[187064]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:53 compute-1 sudo[187062]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:53 compute-1 sudo[187214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfoghlwaffarqhcguloofsvdtkhexcyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026033.4024224-2008-69395011122175/AnsiballZ_file.py'
Feb 02 09:53:53 compute-1 sudo[187214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:53 compute-1 sshd-session[187010]: Invalid user admin from 123.58.212.100 port 54164
Feb 02 09:53:53 compute-1 python3.9[187216]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:53.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:53 compute-1 sudo[187214]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:53.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:54 compute-1 sshd-session[187010]: Connection closed by invalid user admin 123.58.212.100 port 54164 [preauth]
Feb 02 09:53:54 compute-1 sudo[187368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjmzxicflnozmssrndcbpzjwzquwlplc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026034.035616-2008-95147290215960/AnsiballZ_file.py'
Feb 02 09:53:54 compute-1 sudo[187368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:54 compute-1 python3.9[187370]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:54 compute-1 sudo[187368]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:54 compute-1 sudo[187521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjqkadrcfhrofjnoyieguyytmqahjsvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026034.6838615-2008-175425089265007/AnsiballZ_file.py'
Feb 02 09:53:54 compute-1 sudo[187521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:55 compute-1 sshd-session[187329]: Invalid user admin from 123.58.212.100 port 54180
Feb 02 09:53:55 compute-1 python3.9[187523]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:55 compute-1 ceph-mon[80115]: pgmap v425: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1023 B/s wr, 4 op/s
Feb 02 09:53:55 compute-1 sudo[187521]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:53:55 compute-1 sshd-session[187329]: Connection closed by invalid user admin 123.58.212.100 port 54180 [preauth]
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb620000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6140016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:55 compute-1 sudo[187690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlpvqjpzueyivbgoermefhsmqingkpbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026035.5413802-2008-65995237010677/AnsiballZ_file.py'
Feb 02 09:53:55 compute-1 sudo[187690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:55 compute-1 sudo[187692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:53:55 compute-1 sudo[187692]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:53:55 compute-1 sudo[187692]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:55.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:53:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:55.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:53:56 compute-1 python3.9[187695]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:56 compute-1 sudo[187690]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:53:56 compute-1 sudo[187867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpsqxbisqtokancedgenebhhjjjxahec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026036.200078-2008-32107626900169/AnsiballZ_file.py'
Feb 02 09:53:56 compute-1 sudo[187867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:56 compute-1 sshd-session[187611]: Invalid user admin from 123.58.212.100 port 54188
Feb 02 09:53:56 compute-1 python3.9[187869]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:56 compute-1 sudo[187867]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:56 compute-1 sshd-session[187611]: Connection closed by invalid user admin 123.58.212.100 port 54188 [preauth]
Feb 02 09:53:57 compute-1 sudo[188021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evlwlqllzfedkbvswzztflvidyahmdab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026036.8676074-2008-274510360182531/AnsiballZ_file.py'
Feb 02 09:53:57 compute-1 sudo[188021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:57 compute-1 ceph-mon[80115]: pgmap v426: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:53:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:57 compute-1 python3.9[188024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:57 compute-1 sudo[188021]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095357 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:53:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:57 compute-1 sudo[188174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqdlkmqlxdybzvirlykawnyetyaqzlny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026037.5201426-2008-257392135578822/AnsiballZ_file.py'
Feb 02 09:53:57 compute-1 sudo[188174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:53:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:57.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:53:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:53:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:57.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:53:58 compute-1 python3.9[188176]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:58 compute-1 sudo[188174]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:58 compute-1 sshd-session[188018]: Invalid user admin from 123.58.212.100 port 54198
Feb 02 09:53:58 compute-1 sshd-session[188018]: Connection closed by invalid user admin 123.58.212.100 port 54198 [preauth]
Feb 02 09:53:58 compute-1 sudo[188326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eugfpjhldfrpkehchbguixoklwkqzlqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026038.1619828-2008-42944816913453/AnsiballZ_file.py'
Feb 02 09:53:58 compute-1 sudo[188326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:58 compute-1 python3.9[188328]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:53:58 compute-1 sudo[188326]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095359 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:53:59 compute-1 ceph-mon[80115]: pgmap v427: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:53:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:59 compute-1 sudo[188481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubwfzpvmfhasjbpngzyedccuitoggyon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026039.1509202-2305-190010660507789/AnsiballZ_stat.py'
Feb 02 09:53:59 compute-1 sudo[188481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:53:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:53:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:53:59 compute-1 python3.9[188483]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:53:59 compute-1 sudo[188481]: pam_unix(sudo:session): session closed for user root
Feb 02 09:53:59 compute-1 sshd-session[188329]: Invalid user admin from 123.58.212.100 port 54208
Feb 02 09:53:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:53:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:53:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:53:59.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:53:59 compute-1 sudo[188604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkcwpkzfkujjesimbkkaqcytvjdmflnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026039.1509202-2305-190010660507789/AnsiballZ_copy.py'
Feb 02 09:53:59 compute-1 sudo[188604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:00 compute-1 sshd-session[188329]: Connection closed by invalid user admin 123.58.212.100 port 54208 [preauth]
Feb 02 09:54:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:53:59.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:00 compute-1 python3.9[188606]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026039.1509202-2305-190010660507789/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:00 compute-1 sudo[188604]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:00 compute-1 sudo[188758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oureunbwanlljqsfphoxmququmrdxgwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026040.2868788-2305-149784564914833/AnsiballZ_stat.py'
Feb 02 09:54:00 compute-1 sudo[188758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:00 compute-1 python3.9[188760]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:00 compute-1 sudo[188758]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:01 compute-1 sshd-session[188631]: Invalid user admin from 123.58.212.100 port 54224
Feb 02 09:54:01 compute-1 sudo[188882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfxzildejhwkxrvkwlmtbucuwnvivacd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026040.2868788-2305-149784564914833/AnsiballZ_copy.py'
Feb 02 09:54:01 compute-1 sudo[188882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:01 compute-1 ceph-mon[80115]: pgmap v428: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Feb 02 09:54:01 compute-1 sshd-session[188631]: Connection closed by invalid user admin 123.58.212.100 port 54224 [preauth]
Feb 02 09:54:01 compute-1 python3.9[188884]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026040.2868788-2305-149784564914833/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:01 compute-1 sudo[188882]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:01 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:01 compute-1 sudo[188903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:54:01 compute-1 sudo[188903]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:54:01 compute-1 sudo[188903]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:01 compute-1 sudo[188938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:54:01 compute-1 sudo[188938]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:54:01 compute-1 sudo[189094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cquysvdsgfqrvdobhkmuzjlvbkjdixbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026041.5640469-2305-110359189699812/AnsiballZ_stat.py'
Feb 02 09:54:01 compute-1 sudo[189094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:01.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:02 compute-1 python3.9[189102]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:02.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:02 compute-1 sudo[188938]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:02 compute-1 sudo[189094]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:54:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:54:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:54:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:54:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:54:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:54:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:54:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:54:02 compute-1 sudo[189242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nycymvdohxqevkppxcuvzuogpwjvgxdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026041.5640469-2305-110359189699812/AnsiballZ_copy.py'
Feb 02 09:54:02 compute-1 sudo[189242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:02 compute-1 sshd-session[188986]: Invalid user admin from 123.58.212.100 port 52878
Feb 02 09:54:02 compute-1 python3.9[189244]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026041.5640469-2305-110359189699812/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:02 compute-1 sudo[189242]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:02 compute-1 sshd-session[188986]: Connection closed by invalid user admin 123.58.212.100 port 52878 [preauth]
Feb 02 09:54:02 compute-1 sudo[189396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iisikbsugarrcqiqbdmbbeqstrdogdal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026042.6770968-2305-174616988064089/AnsiballZ_stat.py'
Feb 02 09:54:02 compute-1 sudo[189396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:03 compute-1 python3.9[189399]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:03 compute-1 sudo[189396]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:03 compute-1 ceph-mon[80115]: pgmap v429: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 511 B/s wr, 2 op/s
Feb 02 09:54:03 compute-1 ceph-mon[80115]: pgmap v430: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 609 B/s wr, 3 op/s
Feb 02 09:54:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:03 compute-1 sudo[189520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvcqjwvjingsgtfgtdssuvyykyndpttk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026042.6770968-2305-174616988064089/AnsiballZ_copy.py'
Feb 02 09:54:03 compute-1 sudo[189520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:03 compute-1 python3.9[189522]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026042.6770968-2305-174616988064089/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:03 compute-1 sudo[189520]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:03 compute-1 sshd-session[189395]: Invalid user admin from 123.58.212.100 port 52884
Feb 02 09:54:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:03.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:04.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:04 compute-1 sshd-session[189395]: Connection closed by invalid user admin 123.58.212.100 port 52884 [preauth]
Feb 02 09:54:04 compute-1 sudo[189672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkxviwrsyahoczfnwcocpjkelcdtpfnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026043.8491817-2305-235956656393227/AnsiballZ_stat.py'
Feb 02 09:54:04 compute-1 sudo[189672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:04 compute-1 python3.9[189674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:04 compute-1 sudo[189672]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:04 compute-1 sudo[189795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgiptroosxazuoxeolyjttjmeovfhhdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026043.8491817-2305-235956656393227/AnsiballZ_copy.py'
Feb 02 09:54:04 compute-1 sudo[189795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:04 compute-1 python3.9[189797]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026043.8491817-2305-235956656393227/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:04 compute-1 sudo[189795]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:05 compute-1 ceph-mon[80115]: pgmap v431: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 406 B/s rd, 101 B/s wr, 0 op/s
Feb 02 09:54:05 compute-1 sudo[189948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pamqxlvfjkxhttheinnkzaicqiictpal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026044.985122-2305-238934063252152/AnsiballZ_stat.py'
Feb 02 09:54:05 compute-1 sudo[189948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6000016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:05 compute-1 python3.9[189950]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:05 compute-1 sudo[189948]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:05 compute-1 sudo[190073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfodzdkmkarnfhxifrwshvztipppfkhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026044.985122-2305-238934063252152/AnsiballZ_copy.py'
Feb 02 09:54:05 compute-1 sudo[190073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:05.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:06.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:06 compute-1 python3.9[190075]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026044.985122-2305-238934063252152/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:06 compute-1 sudo[190073]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:06 compute-1 sshd-session[189951]: Invalid user admin from 123.58.212.100 port 52896
Feb 02 09:54:06 compute-1 sshd-session[189951]: Connection closed by invalid user admin 123.58.212.100 port 52896 [preauth]
Feb 02 09:54:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:06 compute-1 sudo[190225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrfiavemofemgjomugfrlvrabjwirqia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026046.244873-2305-185985929790888/AnsiballZ_stat.py'
Feb 02 09:54:06 compute-1 sudo[190225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:06 compute-1 python3.9[190227]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:06 compute-1 sudo[190225]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:07 compute-1 sudo[190301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:54:07 compute-1 sudo[190301]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:54:07 compute-1 sudo[190301]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:07 compute-1 sudo[190376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utyirjavythhsldondlruncvkiyvgvae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026046.244873-2305-185985929790888/AnsiballZ_copy.py'
Feb 02 09:54:07 compute-1 sudo[190376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:07 compute-1 ceph-mon[80115]: pgmap v432: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 406 B/s rd, 101 B/s wr, 0 op/s
Feb 02 09:54:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:54:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:54:07 compute-1 python3.9[190378]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026046.244873-2305-185985929790888/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:07 compute-1 sudo[190376]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:07 compute-1 podman[190379]: 2026-02-02 09:54:07.435763389 +0000 UTC m=+0.099723480 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 02 09:54:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:07 compute-1 sudo[190556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdsqzgdgfxkuobcumeuqijvastzgyhzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026047.4871-2305-277857923965015/AnsiballZ_stat.py'
Feb 02 09:54:07 compute-1 sudo[190556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:07.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:07 compute-1 python3.9[190558]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:07 compute-1 sudo[190556]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:08.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:08 compute-1 sudo[190679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwlbyqtsfbmpkygjybyqxnyrdmopbyzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026047.4871-2305-277857923965015/AnsiballZ_copy.py'
Feb 02 09:54:08 compute-1 sudo[190679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095408 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:54:08 compute-1 python3.9[190681]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026047.4871-2305-277857923965015/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:08 compute-1 sudo[190679]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:08 compute-1 sshd-session[190228]: Invalid user admin from 123.58.212.100 port 52906
Feb 02 09:54:08 compute-1 sudo[190832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkbatuwsgzixorthslgniwdzdlwanney ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026048.6744466-2305-242883492080263/AnsiballZ_stat.py'
Feb 02 09:54:08 compute-1 sudo[190832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:09 compute-1 python3.9[190834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:09 compute-1 sudo[190832]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:09 compute-1 sshd-session[190228]: Connection closed by invalid user admin 123.58.212.100 port 52906 [preauth]
Feb 02 09:54:09 compute-1 ceph-mon[80115]: pgmap v433: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 406 B/s rd, 101 B/s wr, 0 op/s
Feb 02 09:54:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:09 compute-1 sudo[190957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leiimydtxdawjnrglxmdzcyupkhwbbpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026048.6744466-2305-242883492080263/AnsiballZ_copy.py'
Feb 02 09:54:09 compute-1 sudo[190957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:09 compute-1 python3.9[190959]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026048.6744466-2305-242883492080263/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:09 compute-1 sudo[190957]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:09.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:10.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:10 compute-1 sudo[191109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvpwdnvrqhngjzhxvcursebdlzpehvvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026049.8597515-2305-77005213568255/AnsiballZ_stat.py'
Feb 02 09:54:10 compute-1 sudo[191109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:10 compute-1 sshd-session[190922]: Invalid user admin from 123.58.212.100 port 52912
Feb 02 09:54:10 compute-1 python3.9[191111]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:10 compute-1 sudo[191109]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:10 compute-1 sshd-session[190922]: Connection closed by invalid user admin 123.58.212.100 port 52912 [preauth]
Feb 02 09:54:10 compute-1 sudo[191233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzhgbstzchtcggmxnteduqtwnrdwvbzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026049.8597515-2305-77005213568255/AnsiballZ_copy.py'
Feb 02 09:54:10 compute-1 sudo[191233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:10 compute-1 python3.9[191235]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026049.8597515-2305-77005213568255/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:10 compute-1 sudo[191233]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095411 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:54:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:11 compute-1 sudo[191387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmnxmbtawmbswldmnuwdgrshxzcijtot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026051.1267045-2305-252374458956224/AnsiballZ_stat.py'
Feb 02 09:54:11 compute-1 sudo[191387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:11 compute-1 python3.9[191389]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:11 compute-1 sudo[191387]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:11 compute-1 ceph-mon[80115]: pgmap v434: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 203 B/s rd, 0 op/s
Feb 02 09:54:11 compute-1 sshd-session[191236]: Invalid user admin from 123.58.212.100 port 52926
Feb 02 09:54:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:11.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:11 compute-1 sshd-session[191236]: Connection closed by invalid user admin 123.58.212.100 port 52926 [preauth]
Feb 02 09:54:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:12.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:12 compute-1 sudo[191512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnvuuxvecsblfraztwtacrdlloqcrwhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026051.1267045-2305-252374458956224/AnsiballZ_copy.py'
Feb 02 09:54:12 compute-1 sudo[191512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:12 compute-1 ceph-mon[80115]: pgmap v435: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 203 B/s rd, 0 op/s
Feb 02 09:54:12 compute-1 python3.9[191514]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026051.1267045-2305-252374458956224/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:12 compute-1 sudo[191512]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:13 compute-1 sudo[191665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avftrrksjcjxotjzzejmfdarbuyrxexy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026052.9545631-2305-277997031214432/AnsiballZ_stat.py'
Feb 02 09:54:13 compute-1 sudo[191665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:13 compute-1 sshd-session[191480]: Invalid user admin from 123.58.212.100 port 34178
Feb 02 09:54:13 compute-1 python3.9[191667]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:13 compute-1 sudo[191665]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:13 compute-1 sshd-session[191480]: Connection closed by invalid user admin 123.58.212.100 port 34178 [preauth]
Feb 02 09:54:13 compute-1 sudo[191788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zetecxplbbthdxmiymraienzbkntxvqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026052.9545631-2305-277997031214432/AnsiballZ_copy.py'
Feb 02 09:54:13 compute-1 sudo[191788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:13 compute-1 python3.9[191790]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026052.9545631-2305-277997031214432/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:13 compute-1 sudo[191788]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:13.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:14.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:14 compute-1 sudo[191942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsvbhjuqjjpsnjjgjsywjzrofdgjzbii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026054.1463447-2305-195197278954890/AnsiballZ_stat.py'
Feb 02 09:54:14 compute-1 sudo[191942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:14 compute-1 python3.9[191944]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:14 compute-1 sudo[191942]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:14 compute-1 sshd-session[191791]: Invalid user admin from 123.58.212.100 port 34182
Feb 02 09:54:14 compute-1 sudo[192066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weqkxnfljxumquxbjmcpqlpqgfflqpgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026054.1463447-2305-195197278954890/AnsiballZ_copy.py'
Feb 02 09:54:14 compute-1 sudo[192066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:14 compute-1 sshd-session[191791]: Connection closed by invalid user admin 123.58.212.100 port 34182 [preauth]
Feb 02 09:54:15 compute-1 python3.9[192068]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026054.1463447-2305-195197278954890/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:15 compute-1 sudo[192066]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:15 compute-1 ceph-mon[80115]: pgmap v436: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:54:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:15 compute-1 sudo[192220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgqbvzqmwffzcwccfnbfyfnfnqyecapa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026055.3469117-2305-211702492162504/AnsiballZ_stat.py'
Feb 02 09:54:15 compute-1 sudo[192220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:15 compute-1 python3.9[192222]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:15 compute-1 sudo[192220]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:15.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:15 compute-1 sudo[192223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:54:15 compute-1 sudo[192223]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:54:15 compute-1 sudo[192223]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:16.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:16 compute-1 sshd-session[192069]: Invalid user admin from 123.58.212.100 port 34192
Feb 02 09:54:16 compute-1 sudo[192368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eucypbexslsglzbjdldifmqnyadnniba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026055.3469117-2305-211702492162504/AnsiballZ_copy.py'
Feb 02 09:54:16 compute-1 sudo[192368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:16 compute-1 sshd-session[192069]: Connection closed by invalid user admin 123.58.212.100 port 34192 [preauth]
Feb 02 09:54:16 compute-1 python3.9[192370]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026055.3469117-2305-211702492162504/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:16 compute-1 sudo[192368]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:17 compute-1 podman[192497]: 2026-02-02 09:54:17.054751432 +0000 UTC m=+0.076840630 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 02 09:54:17 compute-1 python3.9[192534]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:54:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:17 compute-1 ceph-mon[80115]: pgmap v437: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:54:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:54:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:17 compute-1 sshd-session[192395]: Invalid user admin from 123.58.212.100 port 34198
Feb 02 09:54:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:17 compute-1 sshd-session[192395]: Connection closed by invalid user admin 123.58.212.100 port 34198 [preauth]
Feb 02 09:54:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:17.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:18 compute-1 sudo[192695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqebviiofujodwgaemgjoiqxgkgqacha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026057.5287356-2923-43084710421495/AnsiballZ_seboolean.py'
Feb 02 09:54:18 compute-1 sudo[192695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:18.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:18 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:54:18 compute-1 python3.9[192697]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 02 09:54:19 compute-1 sshd-session[192655]: Invalid user admin from 123.58.212.100 port 34212
Feb 02 09:54:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:19 compute-1 sshd-session[192655]: Connection closed by invalid user admin 123.58.212.100 port 34212 [preauth]
Feb 02 09:54:19 compute-1 ceph-mon[80115]: pgmap v438: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:54:19 compute-1 sudo[192695]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:19.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:20 compute-1 sudo[192854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acjbqeuffitftuzofshtywowmtcvldmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026059.654323-2947-195437358441095/AnsiballZ_copy.py'
Feb 02 09:54:20 compute-1 dbus-broker-launch[780]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 02 09:54:20 compute-1 sudo[192854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:20.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:20 compute-1 python3.9[192856]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:20 compute-1 sudo[192854]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:20 compute-1 sshd-session[192727]: Invalid user admin from 123.58.212.100 port 34222
Feb 02 09:54:20 compute-1 sshd-session[192727]: Connection closed by invalid user admin 123.58.212.100 port 34222 [preauth]
Feb 02 09:54:20 compute-1 sudo[193006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whhwkrkahubmsusbdjtdodrfwzfrkmao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026060.394409-2947-277228511950697/AnsiballZ_copy.py'
Feb 02 09:54:20 compute-1 sudo[193006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:20 compute-1 python3.9[193009]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:20 compute-1 sudo[193006]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:54:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:54:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:54:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:21 compute-1 ceph-mon[80115]: pgmap v439: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Feb 02 09:54:21 compute-1 sudo[193161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxayfemrrzkkhfcsxsjiwzttzudoziwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026061.1299083-2947-58998863493806/AnsiballZ_copy.py'
Feb 02 09:54:21 compute-1 sudo[193161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:21 compute-1 python3.9[193163]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:21 compute-1 sudo[193161]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:21.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:22.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:22 compute-1 sudo[193315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzltcfjwelojosszswuasebcaxllakuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026061.8120635-2947-191190691833541/AnsiballZ_copy.py'
Feb 02 09:54:22 compute-1 sudo[193315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:22 compute-1 python3.9[193317]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:22 compute-1 sudo[193315]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:22 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:54:22 compute-1 sshd-session[193231]: Invalid user solv from 80.94.92.184 port 41132
Feb 02 09:54:22 compute-1 sshd-session[193231]: Connection closed by invalid user solv 80.94.92.184 port 41132 [preauth]
Feb 02 09:54:22 compute-1 sshd-session[193010]: Invalid user admin from 123.58.212.100 port 34230
Feb 02 09:54:22 compute-1 sudo[193468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efuquahxunbqxyjjnrpzjrgrgpzkpdci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026062.5454772-2947-96976259542984/AnsiballZ_copy.py'
Feb 02 09:54:22 compute-1 sudo[193468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:22 compute-1 sshd-session[193010]: Connection closed by invalid user admin 123.58.212.100 port 34230 [preauth]
Feb 02 09:54:23 compute-1 python3.9[193470]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:23 compute-1 sudo[193468]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:23 compute-1 ceph-mon[80115]: pgmap v440: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 511 B/s wr, 1 op/s
Feb 02 09:54:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:23 compute-1 sudo[193622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xorhfgboxclgjidarzvsbmmxxwpmskdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026063.6064954-3055-109232494399538/AnsiballZ_copy.py'
Feb 02 09:54:23 compute-1 sudo[193622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:23.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:24.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:24 compute-1 sshd-session[193478]: Invalid user admin from 123.58.212.100 port 46456
Feb 02 09:54:24 compute-1 python3.9[193624]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:24 compute-1 sudo[193622]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:24 compute-1 sshd-session[193478]: Connection closed by invalid user admin 123.58.212.100 port 46456 [preauth]
Feb 02 09:54:24 compute-1 sudo[193776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keuizfxwqudnwkxvgwtuqddkupexnygd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026064.3134656-3055-224989363961899/AnsiballZ_copy.py'
Feb 02 09:54:24 compute-1 sudo[193776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:24 compute-1 python3.9[193778]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:24 compute-1 sudo[193776]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:25 compute-1 sudo[193929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfezpvwsgzfbghjirhwvcrpklfnckrmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026064.998654-3055-244113536599754/AnsiballZ_copy.py'
Feb 02 09:54:25 compute-1 sudo[193929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:25 compute-1 python3.9[193931]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:54:25 compute-1 sudo[193929]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:25 compute-1 ceph-mon[80115]: pgmap v441: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:54:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:25 compute-1 sshd-session[193748]: Invalid user admin from 123.58.212.100 port 46470
Feb 02 09:54:25 compute-1 sshd-session[193748]: Connection closed by invalid user admin 123.58.212.100 port 46470 [preauth]
Feb 02 09:54:25 compute-1 sudo[194081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjdurpuquziwlhkebleqajtvzptcufmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026065.5889254-3055-254903062590006/AnsiballZ_copy.py'
Feb 02 09:54:25 compute-1 sudo[194081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:26.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:26 compute-1 python3.9[194083]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:26 compute-1 sudo[194081]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:26 compute-1 sudo[194235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynscaogipcpppejoxpwynyrhyvgxlpfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026066.283177-3055-197765167586694/AnsiballZ_copy.py'
Feb 02 09:54:26 compute-1 sudo[194235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:26 compute-1 ceph-mon[80115]: pgmap v442: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:54:26 compute-1 python3.9[194237]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:26 compute-1 sudo[194235]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:26 compute-1 sshd-session[194084]: Invalid user admin from 123.58.212.100 port 46476
Feb 02 09:54:27 compute-1 sshd-session[194084]: Connection closed by invalid user admin 123.58.212.100 port 46476 [preauth]
Feb 02 09:54:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:27 compute-1 sudo[194390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkvvtgjulfwwlbyhuxkwatvbnxnenply ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026067.0511298-3163-22307589230900/AnsiballZ_systemd.py'
Feb 02 09:54:27 compute-1 sudo[194390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:27 compute-1 python3.9[194392]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:54:27 compute-1 systemd[1]: Reloading.
Feb 02 09:54:27 compute-1 systemd-sysv-generator[194421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:54:27 compute-1 systemd-rc-local-generator[194417]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:54:27 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Feb 02 09:54:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:27.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:27 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Feb 02 09:54:27 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Feb 02 09:54:27 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 02 09:54:27 compute-1 systemd[1]: Starting libvirt logging daemon...
Feb 02 09:54:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:28.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:28 compute-1 systemd[1]: Started libvirt logging daemon.
Feb 02 09:54:28 compute-1 sudo[194390]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:28 compute-1 sshd-session[194393]: Invalid user admin from 123.58.212.100 port 46480
Feb 02 09:54:28 compute-1 sshd-session[194393]: Connection closed by invalid user admin 123.58.212.100 port 46480 [preauth]
Feb 02 09:54:28 compute-1 sudo[194585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilytfwxajsjwyqtxyanemkpkeeafqwzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026068.2422357-3163-133831691590782/AnsiballZ_systemd.py'
Feb 02 09:54:28 compute-1 sudo[194585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:28 compute-1 python3.9[194587]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:54:28 compute-1 systemd[1]: Reloading.
Feb 02 09:54:29 compute-1 systemd-sysv-generator[194616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:54:29 compute-1 systemd-rc-local-generator[194613]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:54:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:29 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 02 09:54:29 compute-1 ceph-mon[80115]: pgmap v443: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:54:29 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Feb 02 09:54:29 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 02 09:54:29 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 02 09:54:29 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 02 09:54:29 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 02 09:54:29 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 02 09:54:29 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Feb 02 09:54:29 compute-1 systemd[1]: Started libvirt nodedev daemon.
Feb 02 09:54:29 compute-1 sudo[194585]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f0000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:29 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 02 09:54:29 compute-1 sshd-session[194589]: Invalid user admin from 123.58.212.100 port 46486
Feb 02 09:54:29 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 02 09:54:29 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 02 09:54:29 compute-1 sudo[194812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfyeuanfgzbcvcaefdsutylpwjmwadhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026069.5329502-3163-17075976205170/AnsiballZ_systemd.py'
Feb 02 09:54:29 compute-1 sudo[194812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:29 compute-1 sshd-session[194589]: Connection closed by invalid user admin 123.58.212.100 port 46486 [preauth]
Feb 02 09:54:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:29.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:30.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:30 compute-1 python3.9[194814]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:54:30 compute-1 systemd[1]: Reloading.
Feb 02 09:54:30 compute-1 systemd-rc-local-generator[194841]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:54:30 compute-1 systemd-sysv-generator[194848]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:54:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095430 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:54:30 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 02 09:54:30 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 02 09:54:30 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 02 09:54:30 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 02 09:54:30 compute-1 systemd[1]: Starting libvirt proxy daemon...
Feb 02 09:54:30 compute-1 systemd[1]: Started libvirt proxy daemon.
Feb 02 09:54:30 compute-1 sudo[194812]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:30 compute-1 setroubleshoot[194627]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 6c0cf167-02db-439b-80c9-cd599f4d6f27
Feb 02 09:54:30 compute-1 setroubleshoot[194627]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 02 09:54:30 compute-1 setroubleshoot[194627]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 6c0cf167-02db-439b-80c9-cd599f4d6f27
Feb 02 09:54:30 compute-1 setroubleshoot[194627]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 02 09:54:31 compute-1 sshd-session[194817]: Invalid user admin from 123.58.212.100 port 46494
Feb 02 09:54:31 compute-1 sudo[195028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocvvuizxertiedxwwcsghpjruscrsjht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026070.7614088-3163-225672878783492/AnsiballZ_systemd.py'
Feb 02 09:54:31 compute-1 sudo[195028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095431 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:54:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:31 compute-1 sshd-session[194817]: Connection closed by invalid user admin 123.58.212.100 port 46494 [preauth]
Feb 02 09:54:31 compute-1 ceph-mon[80115]: pgmap v444: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Feb 02 09:54:31 compute-1 python3.9[195030]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:54:31 compute-1 systemd[1]: Reloading.
Feb 02 09:54:31 compute-1 systemd-rc-local-generator[195058]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:54:31 compute-1 systemd-sysv-generator[195061]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:54:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:31 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Feb 02 09:54:31 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Feb 02 09:54:31 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 02 09:54:31 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 02 09:54:31 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 02 09:54:31 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 02 09:54:31 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 02 09:54:31 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 02 09:54:31 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 02 09:54:31 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 02 09:54:31 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Feb 02 09:54:31 compute-1 systemd[1]: Started libvirt QEMU daemon.
Feb 02 09:54:31 compute-1 sudo[195028]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:54:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:31.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:54:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:32.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:32 compute-1 sudo[195245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owvjleelgqbfbawgfqkuillxdfpwcazy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026071.8868866-3163-145247923083594/AnsiballZ_systemd.py'
Feb 02 09:54:32 compute-1 sudo[195245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:54:32 compute-1 sshd-session[195066]: Invalid user admin from 123.58.212.100 port 53144
Feb 02 09:54:32 compute-1 python3.9[195247]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:54:32 compute-1 systemd[1]: Reloading.
Feb 02 09:54:32 compute-1 systemd-rc-local-generator[195274]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:54:32 compute-1 systemd-sysv-generator[195278]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:54:32 compute-1 sshd-session[195066]: Connection closed by invalid user admin 123.58.212.100 port 53144 [preauth]
Feb 02 09:54:32 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Feb 02 09:54:32 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Feb 02 09:54:32 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Feb 02 09:54:32 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 02 09:54:32 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 02 09:54:32 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 02 09:54:32 compute-1 systemd[1]: Starting libvirt secret daemon...
Feb 02 09:54:32 compute-1 systemd[1]: Started libvirt secret daemon.
Feb 02 09:54:32 compute-1 sudo[195245]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f0001840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:33 compute-1 ceph-mon[80115]: pgmap v445: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 597 B/s wr, 2 op/s
Feb 02 09:54:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:33 compute-1 sudo[195460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obkrserxakhzflkqajzslfvfqballqtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026073.2519925-3274-72186947640265/AnsiballZ_file.py'
Feb 02 09:54:33 compute-1 sudo[195460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:33 compute-1 python3.9[195462]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:33 compute-1 sshd-session[195291]: Invalid user admin from 123.58.212.100 port 53146
Feb 02 09:54:33 compute-1 sudo[195460]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:33.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:33 compute-1 sshd-session[195291]: Connection closed by invalid user admin 123.58.212.100 port 53146 [preauth]
Feb 02 09:54:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:34.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:34 compute-1 sudo[195614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzrfnspwsqdndpthhgwkhmfdqmwwjhis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026073.9884043-3298-189839414698227/AnsiballZ_find.py'
Feb 02 09:54:34 compute-1 sudo[195614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:34 compute-1 python3.9[195616]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 02 09:54:34 compute-1 sudo[195614]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:35 compute-1 sudo[195767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lknezffqloxabhflxqnydcysogrrddjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026074.7830114-3322-94292849745601/AnsiballZ_command.py'
Feb 02 09:54:35 compute-1 sudo[195767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:35 compute-1 python3.9[195769]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:54:35 compute-1 sudo[195767]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:35 compute-1 ceph-mon[80115]: pgmap v446: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 597 B/s wr, 2 op/s
Feb 02 09:54:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f0001840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:35 compute-1 sshd-session[195586]: Invalid user admin from 123.58.212.100 port 53154
Feb 02 09:54:35 compute-1 python3.9[195923]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 02 09:54:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:35.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:36 compute-1 sshd-session[195586]: Connection closed by invalid user admin 123.58.212.100 port 53154 [preauth]
Feb 02 09:54:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:54:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:36.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:54:36 compute-1 sudo[195947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:54:36 compute-1 sudo[195947]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:54:36 compute-1 sudo[195947]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:36 compute-1 python3.9[196102]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:37 compute-1 sshd-session[195974]: Invalid user admin from 123.58.212.100 port 53156
Feb 02 09:54:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:37 compute-1 ceph-mon[80115]: pgmap v447: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Feb 02 09:54:37 compute-1 sshd-session[195974]: Connection closed by invalid user admin 123.58.212.100 port 53156 [preauth]
Feb 02 09:54:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:37 compute-1 python3.9[196223]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026076.4173043-3379-269828669694851/.source.xml follow=False _original_basename=secret.xml.j2 checksum=19e72152fe151d80bf9ff9b6a78f27bac75d38a6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:37 compute-1 podman[196224]: 2026-02-02 09:54:37.65324014 +0000 UTC m=+0.130560797 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 02 09:54:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:37.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:38.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:38 compute-1 sudo[196401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iccopwfjerwguiwjgcyjrdivpjwexwbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026077.7604532-3424-136279970768441/AnsiballZ_command.py'
Feb 02 09:54:38 compute-1 sudo[196401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:38 compute-1 python3.9[196403]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine d241d473-9fcb-5f74-b163-f1ca4454e7f1
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:54:38 compute-1 polkitd[43542]: Registered Authentication Agent for unix-process:196405:324064 (system bus name :1.1852 [pkttyagent --process 196405 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Feb 02 09:54:38 compute-1 polkitd[43542]: Unregistered Authentication Agent for unix-process:196405:324064 (system bus name :1.1852, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Feb 02 09:54:38 compute-1 polkitd[43542]: Registered Authentication Agent for unix-process:196404:324063 (system bus name :1.1853 [pkttyagent --process 196404 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Feb 02 09:54:38 compute-1 polkitd[43542]: Unregistered Authentication Agent for unix-process:196404:324063 (system bus name :1.1853, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Feb 02 09:54:38 compute-1 sudo[196401]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:38 compute-1 sshd-session[196274]: Invalid user admin from 123.58.212.100 port 53158
Feb 02 09:54:38 compute-1 sshd-session[196274]: Connection closed by invalid user admin 123.58.212.100 port 53158 [preauth]
Feb 02 09:54:39 compute-1 python3.9[196566]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:39 compute-1 ceph-mon[80115]: pgmap v448: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Feb 02 09:54:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb620002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:39 compute-1 sudo[196719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvqqpalxwdwfjaxueferstlnprernkyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026079.3957846-3472-88389476068733/AnsiballZ_command.py'
Feb 02 09:54:39 compute-1 sudo[196719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:39 compute-1 sudo[196719]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:39.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:40 compute-1 sshd-session[196591]: Invalid user admin from 123.58.212.100 port 53160
Feb 02 09:54:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:40.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:40 compute-1 sshd-session[196591]: Connection closed by invalid user admin 123.58.212.100 port 53160 [preauth]
Feb 02 09:54:40 compute-1 sudo[196874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtocwbcynqwpmjyswlduahwdssibmayp ; FSID=d241d473-9fcb-5f74-b163-f1ca4454e7f1 KEY=AQBGcIBpAAAAABAA2I2uJAQ9+FTGDrMvmIgfmg== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026080.2132668-3496-273700968405321/AnsiballZ_command.py'
Feb 02 09:54:40 compute-1 sudo[196874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:40 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 02 09:54:40 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 02 09:54:40 compute-1 polkitd[43542]: Registered Authentication Agent for unix-process:196878:324313 (system bus name :1.1856 [pkttyagent --process 196878 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Feb 02 09:54:40 compute-1 polkitd[43542]: Unregistered Authentication Agent for unix-process:196878:324313 (system bus name :1.1856, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Feb 02 09:54:40 compute-1 sudo[196874]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:41 compute-1 sudo[197033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgxteckltffdvibpazmyenlxydpzzmbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026081.1011236-3520-18341903951725/AnsiballZ_copy.py'
Feb 02 09:54:41 compute-1 sudo[197033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:41 compute-1 sshd-session[196822]: Invalid user admin from 123.58.212.100 port 53168
Feb 02 09:54:41 compute-1 ceph-mon[80115]: pgmap v449: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 170 B/s wr, 1 op/s
Feb 02 09:54:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:41 compute-1 python3.9[197035]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:41 compute-1 sudo[197033]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:41 compute-1 sshd-session[196822]: Connection closed by invalid user admin 123.58.212.100 port 53168 [preauth]
Feb 02 09:54:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:54:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:41.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:54:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:42.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:42 compute-1 sudo[197187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icncwxadtwmwungauhhyufpfllddzeev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026081.8428671-3544-96012965375489/AnsiballZ_stat.py'
Feb 02 09:54:42 compute-1 sudo[197187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:42 compute-1 python3.9[197189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:42 compute-1 sudo[197187]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:42 compute-1 sudo[197310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkymwtfccfdxqjdenerydxwyzchfwrdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026081.8428671-3544-96012965375489/AnsiballZ_copy.py'
Feb 02 09:54:42 compute-1 sudo[197310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:42 compute-1 sshd-session[197083]: Invalid user admin from 123.58.212.100 port 34118
Feb 02 09:54:42 compute-1 python3.9[197312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026081.8428671-3544-96012965375489/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:42 compute-1 sudo[197310]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:42 compute-1 sshd-session[197083]: Connection closed by invalid user admin 123.58.212.100 port 34118 [preauth]
Feb 02 09:54:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb620002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:43 compute-1 ceph-mon[80115]: pgmap v450: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:54:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:43 compute-1 sudo[197465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bowdkyfohxvdheorfmvsvnddgpscrrsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026083.3442652-3592-127177532536129/AnsiballZ_file.py'
Feb 02 09:54:43 compute-1 sudo[197465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:43 compute-1 python3.9[197467]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:43 compute-1 sudo[197465]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:43.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:44.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:44 compute-1 sshd-session[197338]: Invalid user admin from 123.58.212.100 port 34122
Feb 02 09:54:44 compute-1 sshd-session[197338]: Connection closed by invalid user admin 123.58.212.100 port 34122 [preauth]
Feb 02 09:54:44 compute-1 sudo[197619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srfrtbbxtnrjafogpnqiwocpxqsoxtvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026084.0787084-3616-183980066427427/AnsiballZ_stat.py'
Feb 02 09:54:44 compute-1 sudo[197619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:44 compute-1 python3.9[197621]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:44 compute-1 sudo[197619]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:54:44.893 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:54:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:54:44.893 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:54:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:54:44.893 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:54:45 compute-1 sudo[197698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxtvhxjrfcyodcbmijxlxpgldthiliie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026084.0787084-3616-183980066427427/AnsiballZ_file.py'
Feb 02 09:54:45 compute-1 sudo[197698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb620002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:45 compute-1 python3.9[197700]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:45 compute-1 sudo[197698]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:45 compute-1 ceph-mon[80115]: pgmap v451: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:54:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:45 compute-1 sshd-session[197588]: Invalid user admin from 123.58.212.100 port 34130
Feb 02 09:54:45 compute-1 sudo[197850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujkxtaczfsxwnltwehrwlmjoxzliavlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026085.5327902-3652-279958994367135/AnsiballZ_stat.py'
Feb 02 09:54:45 compute-1 sudo[197850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:45.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:46 compute-1 python3.9[197852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:46 compute-1 sudo[197850]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:46 compute-1 sshd-session[197588]: Connection closed by invalid user admin 123.58.212.100 port 34130 [preauth]
Feb 02 09:54:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:46.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:46 compute-1 sudo[197930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dftnjgqtygzilojsnjfzgxwosyymxlmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026085.5327902-3652-279958994367135/AnsiballZ_file.py'
Feb 02 09:54:46 compute-1 sudo[197930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:46 compute-1 python3.9[197932]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.7q_xu196 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:46 compute-1 sudo[197930]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:47 compute-1 sudo[198096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfqidjunmjuhyeexpuatutjpnysteiof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026086.795394-3688-263964639593521/AnsiballZ_stat.py'
Feb 02 09:54:47 compute-1 sudo[198096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:47 compute-1 podman[198057]: 2026-02-02 09:54:47.173233697 +0000 UTC m=+0.076882422 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 02 09:54:47 compute-1 sshd-session[197902]: Invalid user admin from 123.58.212.100 port 34142
Feb 02 09:54:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:47 compute-1 python3.9[198103]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:47 compute-1 sudo[198096]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:47 compute-1 sshd-session[197902]: Connection closed by invalid user admin 123.58.212.100 port 34142 [preauth]
Feb 02 09:54:47 compute-1 ceph-mon[80115]: pgmap v452: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:54:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:54:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6200089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:47 compute-1 sudo[198182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fldjiddmcaggfsbuqagcvjsciwefngim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026086.795394-3688-263964639593521/AnsiballZ_file.py'
Feb 02 09:54:47 compute-1 sudo[198182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:47 compute-1 python3.9[198184]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:47 compute-1 sudo[198182]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:54:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:47.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:54:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:48.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:48 compute-1 sshd-session[198177]: Invalid user admin from 123.58.212.100 port 34146
Feb 02 09:54:48 compute-1 sudo[198334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhlpnyomsrvhkhoebmugiknbbleetdov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026088.1366878-3727-77680741725593/AnsiballZ_command.py'
Feb 02 09:54:48 compute-1 sudo[198334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:48 compute-1 sshd-session[198177]: Connection closed by invalid user admin 123.58.212.100 port 34146 [preauth]
Feb 02 09:54:48 compute-1 python3.9[198336]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:54:48 compute-1 sudo[198334]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:49 compute-1 ceph-mon[80115]: pgmap v453: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:54:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6200089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:49 compute-1 sudo[198490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzvjgztgawsrcldtepqixukkmowiynip ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1770026089.0468326-3751-24510146046566/AnsiballZ_edpm_nftables_from_files.py'
Feb 02 09:54:49 compute-1 sudo[198490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:49 compute-1 python3[198492]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 02 09:54:49 compute-1 sudo[198490]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:49 compute-1 sshd-session[198363]: Invalid user admin from 123.58.212.100 port 34154
Feb 02 09:54:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:54:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:49.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:54:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:50.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:50 compute-1 sshd-session[198363]: Connection closed by invalid user admin 123.58.212.100 port 34154 [preauth]
Feb 02 09:54:50 compute-1 sudo[198642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajpgorjvtohtfbnxlrxqdbkufaodshja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026090.0017726-3775-239252999122918/AnsiballZ_stat.py'
Feb 02 09:54:50 compute-1 sudo[198642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:50 compute-1 python3.9[198644]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:50 compute-1 sudo[198642]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:50 compute-1 sudo[198723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqufddykhuesyaweqpgkczrdedbtebvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026090.0017726-3775-239252999122918/AnsiballZ_file.py'
Feb 02 09:54:50 compute-1 sudo[198723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:50 compute-1 python3.9[198725]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:51 compute-1 sudo[198723]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:51 compute-1 sshd-session[198645]: Invalid user admin from 123.58.212.100 port 34156
Feb 02 09:54:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:51 compute-1 ceph-mon[80115]: pgmap v454: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:54:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:51 compute-1 sshd-session[198645]: Connection closed by invalid user admin 123.58.212.100 port 34156 [preauth]
Feb 02 09:54:51 compute-1 sudo[198875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxobzrutobfslxyarlttnsjbqyzxwmrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026091.3288875-3811-263282415164707/AnsiballZ_stat.py'
Feb 02 09:54:51 compute-1 sudo[198875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:51 compute-1 python3.9[198877]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:51 compute-1 sudo[198875]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:54:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:51.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:54:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:52.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:52 compute-1 sudo[199002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogzksinmzhtngbkqpgshxaecihyuaolk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026091.3288875-3811-263282415164707/AnsiballZ_copy.py'
Feb 02 09:54:52 compute-1 sudo[199002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095452 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:54:52 compute-1 ceph-mon[80115]: pgmap v455: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:54:52 compute-1 python3.9[199004]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026091.3288875-3811-263282415164707/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:52 compute-1 sudo[199002]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:53 compute-1 sshd-session[198880]: Invalid user admin from 123.58.212.100 port 37418
Feb 02 09:54:53 compute-1 sudo[199155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbyfxbnznnsnjwhnaaudchpeovofukca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026092.8802278-3856-199121156629041/AnsiballZ_stat.py'
Feb 02 09:54:53 compute-1 sudo[199155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6200096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:53 compute-1 sshd-session[198880]: Connection closed by invalid user admin 123.58.212.100 port 37418 [preauth]
Feb 02 09:54:53 compute-1 python3.9[199157]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:53 compute-1 sudo[199155]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f80039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:53 compute-1 sudo[199235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afbnkgfhxurjrymtznmncswsnhejxuld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026092.8802278-3856-199121156629041/AnsiballZ_file.py'
Feb 02 09:54:53 compute-1 sudo[199235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:53 compute-1 python3.9[199237]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:53 compute-1 sudo[199235]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:54:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:53.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:54:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:54.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:54 compute-1 sshd-session[199177]: Invalid user admin from 123.58.212.100 port 37424
Feb 02 09:54:54 compute-1 sudo[199387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clfdjjijxadgmdawstrgfqqjfztfiinn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026094.1291409-3892-218470614342155/AnsiballZ_stat.py'
Feb 02 09:54:54 compute-1 sudo[199387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:54 compute-1 python3.9[199389]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:54 compute-1 sshd-session[199177]: Connection closed by invalid user admin 123.58.212.100 port 37424 [preauth]
Feb 02 09:54:54 compute-1 sudo[199387]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:54 compute-1 sudo[199466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxlveqdnxameczicoysvdcazarokxdsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026094.1291409-3892-218470614342155/AnsiballZ_file.py'
Feb 02 09:54:54 compute-1 sudo[199466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:55 compute-1 python3.9[199468]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:55 compute-1 sudo[199466]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614001fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:55 compute-1 ceph-mon[80115]: pgmap v456: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:54:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6200096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:55 compute-1 sudo[199618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmotunjcpwbdmpasmuobxxdqtrtdrcjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026095.3225706-3928-83691514853329/AnsiballZ_stat.py'
Feb 02 09:54:55 compute-1 sudo[199618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:55 compute-1 python3.9[199620]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:54:55 compute-1 sudo[199618]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:54:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:55.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:54:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:56.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:56 compute-1 sudo[199695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:54:56 compute-1 sudo[199695]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:54:56 compute-1 sudo[199695]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:54:56 compute-1 sudo[199770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqvxsnumszfyprqkdcymorxbqsgkfzbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026095.3225706-3928-83691514853329/AnsiballZ_copy.py'
Feb 02 09:54:56 compute-1 sudo[199770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:56 compute-1 python3.9[199772]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1770026095.3225706-3928-83691514853329/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:56 compute-1 sudo[199770]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:56 compute-1 sshd-session[199621]: Invalid user admin from 123.58.212.100 port 37434
Feb 02 09:54:57 compute-1 sshd-session[199621]: Connection closed by invalid user admin 123.58.212.100 port 37434 [preauth]
Feb 02 09:54:57 compute-1 sudo[199923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uavezdyhocqdlvvxrhqlcvielzirhrac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026096.9166455-3973-243115260092709/AnsiballZ_file.py'
Feb 02 09:54:57 compute-1 sudo[199923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:57 compute-1 ceph-mon[80115]: pgmap v457: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:54:57 compute-1 python3.9[199925]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:57 compute-1 sudo[199923]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:57 compute-1 sudo[200077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lseaffatkkkadvhvxhttyyaodgyvzeik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026097.6103628-3997-126032684167939/AnsiballZ_command.py'
Feb 02 09:54:57 compute-1 sudo[200077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:54:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:57.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:54:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:54:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:54:58.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:54:58 compute-1 python3.9[200079]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:54:58 compute-1 sudo[200077]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:58 compute-1 sshd-session[199926]: Invalid user admin from 123.58.212.100 port 37442
Feb 02 09:54:58 compute-1 sudo[200233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymrqcntpihueezqknwkcrhcmwvvdfnfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026098.3255959-4021-194789640088369/AnsiballZ_blockinfile.py'
Feb 02 09:54:58 compute-1 sudo[200233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:58 compute-1 sshd-session[199926]: Connection closed by invalid user admin 123.58.212.100 port 37442 [preauth]
Feb 02 09:54:58 compute-1 python3.9[200235]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:54:58 compute-1 sudo[200233]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:59 compute-1 ceph-mon[80115]: pgmap v458: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:54:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:54:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:54:59 compute-1 sudo[200387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcksiglyctabqripydswxnscihzmzbhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026099.3779953-4048-217278091975711/AnsiballZ_command.py'
Feb 02 09:54:59 compute-1 sudo[200387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:54:59 compute-1 python3.9[200389]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:54:59 compute-1 sudo[200387]: pam_unix(sudo:session): session closed for user root
Feb 02 09:54:59 compute-1 sshd-session[200239]: Invalid user admin from 123.58.212.100 port 37450
Feb 02 09:54:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:54:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:54:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:54:59.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:00.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:00 compute-1 sshd-session[200239]: Connection closed by invalid user admin 123.58.212.100 port 37450 [preauth]
Feb 02 09:55:00 compute-1 sudo[200542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dadiorszodbzyofomuhegltrjssrnwhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026100.1365967-4072-38613525041407/AnsiballZ_stat.py'
Feb 02 09:55:00 compute-1 sudo[200542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:00 compute-1 python3.9[200544]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:55:00 compute-1 sudo[200542]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:01 compute-1 sshd-session[200490]: Invalid user admin from 123.58.212.100 port 37464
Feb 02 09:55:01 compute-1 ceph-mon[80115]: pgmap v459: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:55:01 compute-1 sudo[200699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aydolmadgtmhnuvxobiipdudokffvlie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026101.1200545-4096-117252627780938/AnsiballZ_command.py'
Feb 02 09:55:01 compute-1 sudo[200699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:01 compute-1 sshd-session[200490]: Connection closed by invalid user admin 123.58.212.100 port 37464 [preauth]
Feb 02 09:55:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:01 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:01 compute-1 python3.9[200701]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
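Editor's note: the command above concatenates the flush, rules, and update-jumps fragments and feeds them to a single `nft -f -` run, so flushing and repopulating the EDPM chains happens in one ruleset transaction; `set -o pipefail` makes a failure in either stage fatal. A rough Python equivalent of that step (file list taken from the log line):

    import subprocess

    FILES = [
        "/etc/nftables/edpm-flushes.nft",
        "/etc/nftables/edpm-rules.nft",
        "/etc/nftables/edpm-update-jumps.nft",
    ]

    def apply_edpm_rules():
        # Concatenate the fragments and hand them to nft on stdin in one
        # transaction, mirroring `cat ... | nft -f -` with pipefail semantics.
        payload = b"".join(open(p, "rb").read() for p in FILES)
        subprocess.run(["nft", "-f", "-"], input=payload, check=True)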
Feb 02 09:55:01 compute-1 sudo[200699]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:01.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:02.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:02 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:55:02 compute-1 sudo[200856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umbkjrygbstmqubdgiphejmdxqklgkya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026101.93465-4120-45698433341106/AnsiballZ_file.py'
Feb 02 09:55:02 compute-1 sudo[200856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:55:02 compute-1 python3.9[200858]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:02 compute-1 sudo[200856]: pam_unix(sudo:session): session closed for user root
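Editor's note: the stat on /etc/nftables/edpm-rules.nft.changed at 09:55:00, the rule reload above, and this file state=absent task form a flag-file handler pattern: an earlier task touches the .changed marker when the rendered rules differ, the reload runs only while the marker exists, and the marker is removed once the new ruleset is live. A compressed, self-contained sketch of that flow (marker path from the log):

    import os
    import subprocess

    MARKER = "/etc/nftables/edpm-rules.nft.changed"

    def reload_if_changed():
        # Only reapply the ruleset when an earlier task flagged a change,
        # then drop the flag so reruns stay idempotent.
        if not os.path.exists(MARKER):
            return False
        subprocess.run(["nft", "-f", "/etc/nftables/edpm-rules.nft"], check=True)
        os.remove(MARKER)
        return True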
Feb 02 09:55:02 compute-1 sshd-session[200712]: Invalid user admin from 123.58.212.100 port 56786
Feb 02 09:55:02 compute-1 sshd-session[200712]: Connection closed by invalid user admin 123.58.212.100 port 56786 [preauth]
Feb 02 09:55:02 compute-1 sudo[201009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yielalmrslzlrgsqgykkihmvfascqnds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026102.6988423-4144-276850308272281/AnsiballZ_stat.py'
Feb 02 09:55:02 compute-1 sudo[201009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:03 compute-1 python3.9[201011]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:55:03 compute-1 sudo[201009]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00028c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:03 compute-1 ceph-mon[80115]: pgmap v460: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:55:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:03 compute-1 sudo[201134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rioayrunevcadinkogdmzjlcmsplkzcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026102.6988423-4144-276850308272281/AnsiballZ_copy.py'
Feb 02 09:55:03 compute-1 sudo[201134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:03 compute-1 python3.9[201136]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026102.6988423-4144-276850308272281/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:03 compute-1 sudo[201134]: pam_unix(sudo:session): session closed for user root
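Editor's note: the legacy.stat / legacy.copy pair for /etc/systemd/system/edpm_libvirt.target is the usual idempotent file deployment: the SHA-1 of the existing file is compared against the staged source's checksum (13035a1a... in the log) and the copy only happens on mismatch. A minimal sketch of that check-then-copy step; the destination path and mode are from the log, while the staging path stands in for the long ansible-tmp source directory:

    import hashlib
    import os
    import shutil

    def sha1_of(path):
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def deploy(src, dest, mode=0o644):
        # Copy only when the destination is missing or its checksum differs.
        if os.path.exists(dest) and sha1_of(dest) == sha1_of(src):
            return "unchanged"
        shutil.copyfile(src, dest)
        os.chmod(dest, mode)
        return "changed"

    # usage (staging path is a placeholder for the ansible-tmp source):
    # deploy(staged_source, "/etc/systemd/system/edpm_libvirt.target", 0o644)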
Feb 02 09:55:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:04.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:04.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:04 compute-1 sshd-session[201012]: Invalid user pi from 123.58.212.100 port 56790
Feb 02 09:55:04 compute-1 sshd-session[201012]: Connection closed by invalid user pi 123.58.212.100 port 56790 [preauth]
Feb 02 09:55:04 compute-1 sudo[201286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avngjzydtuamxosumxoqlvwamttsalqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026104.0811508-4189-75525924766860/AnsiballZ_stat.py'
Feb 02 09:55:04 compute-1 sudo[201286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:04 compute-1 python3.9[201288]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:55:04 compute-1 sudo[201286]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:05 compute-1 sudo[201412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmnlgqqtwwoeddnstgbmaxcwnbmzytob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026104.0811508-4189-75525924766860/AnsiballZ_copy.py'
Feb 02 09:55:05 compute-1 sudo[201412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:55:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:55:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:05 compute-1 python3.9[201414]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026104.0811508-4189-75525924766860/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:05 compute-1 sudo[201412]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:05 compute-1 ceph-mon[80115]: pgmap v461: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 597 B/s wr, 2 op/s
Feb 02 09:55:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:05 compute-1 sshd-session[201289]: Connection closed by authenticating user ftp 123.58.212.100 port 56804 [preauth]
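Editor's note: the sshd-session entries show a steady probe from 123.58.212.100 cycling through throwaway usernames (admin, pi, ftp) and disconnecting pre-auth. A small sketch for tallying such attempts per source address from a saved journal extract; the input file name is an assumption:

    import re
    from collections import Counter

    PATTERN = re.compile(r"Invalid user (\S+) from (\S+) port \d+")

    def count_probes(logfile="compute-1-journal.log"):
        # Count (source address, attempted user) pairs for pre-auth failures.
        hits = Counter()
        with open(logfile) as f:
            for line in f:
                m = PATTERN.search(line)
                if m:
                    user, addr = m.groups()
                    hits[(addr, user)] += 1
        return hits

    for (addr, user), n in count_probes().most_common(10):
        print(f"{addr} tried '{user}' {n} times")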
Feb 02 09:55:05 compute-1 sudo[201564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llijvzzkqwwzdcwstgemyudrqywpgtam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026105.5721936-4234-221588767828483/AnsiballZ_stat.py'
Feb 02 09:55:05 compute-1 sudo[201564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:06.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:06 compute-1 python3.9[201566]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:55:06 compute-1 sudo[201564]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:55:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:06.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:55:06 compute-1 sudo[201687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxpeuxskhgeyrlncynwmxqydndvjrthh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026105.5721936-4234-221588767828483/AnsiballZ_copy.py'
Feb 02 09:55:06 compute-1 sudo[201687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:06 compute-1 python3.9[201689]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026105.5721936-4234-221588767828483/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:06 compute-1 sudo[201687]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:07 compute-1 sudo[201813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:55:07 compute-1 sudo[201813]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:55:07 compute-1 sudo[201813]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:07 compute-1 sudo[201865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwqqqkdbchptdvlljhhgilxrhcskggxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026106.94487-4279-270210030843855/AnsiballZ_systemd.py'
Feb 02 09:55:07 compute-1 sudo[201865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:07 compute-1 sudo[201867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:55:07 compute-1 sudo[201867]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:55:07 compute-1 ceph-mon[80115]: pgmap v462: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 596 B/s wr, 2 op/s
Feb 02 09:55:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:07 compute-1 python3.9[201868]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:55:07 compute-1 systemd[1]: Reloading.
Feb 02 09:55:07 compute-1 systemd-sysv-generator[201938]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:55:07 compute-1 systemd-rc-local-generator[201932]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:55:07 compute-1 sudo[201867]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:07 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Feb 02 09:55:07 compute-1 sudo[201865]: pam_unix(sudo:session): session closed for user root
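Editor's note: the ansible.builtin.systemd call at 09:55:07 (daemon_reload=True, enabled=True, state=restarted for edpm_libvirt.target) accounts for the "Reloading." and "Reached target edpm_libvirt.target." lines above. Roughly the same sequence driven directly through systemctl, as a sketch:

    import subprocess

    def restart_target(unit="edpm_libvirt.target"):
        # daemon_reload=True, enabled=True, state=restarted, approximately as
        # the ansible.builtin.systemd module would drive systemctl.
        for cmd in (["systemctl", "daemon-reload"],
                    ["systemctl", "enable", unit],
                    ["systemctl", "restart", unit]):
            subprocess.run(cmd, check=True)

    # restart_target()  # requires root on a systemd host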
Feb 02 09:55:07 compute-1 podman[201959]: 2026-02-02 09:55:07.976391031 +0000 UTC m=+0.079100741 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
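Editor's note: the podman health_status events record periodic healthcheck runs for ovn_controller (and, further down, ovn_metadata_agent); the check command and mount come from the container's config_data ('test': '/openstack/healthcheck'). The current recorded state can be read back with podman inspect, e.g. this sketch (container name taken from the event above):

    import subprocess

    def container_health(name="ovn_controller"):
        # Ask podman for the latest recorded health state of the container.
        out = subprocess.run(
            ["podman", "inspect", "--format", "{{.State.Health.Status}}", name],
            capture_output=True, text=True, check=True)
        return out.stdout.strip()

    print(container_health())  # expected to print "healthy", matching the event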
Feb 02 09:55:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:08.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:08.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:08 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:55:08 compute-1 sudo[202138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnqcamichjfwlrfkiifaawwnhzjsndnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026108.286525-4303-5906744649862/AnsiballZ_systemd.py'
Feb 02 09:55:08 compute-1 sudo[202138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:08 compute-1 python3.9[202140]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 02 09:55:08 compute-1 systemd[1]: Reloading.
Feb 02 09:55:09 compute-1 systemd-sysv-generator[202168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:55:09 compute-1 systemd-rc-local-generator[202165]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:55:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:09 compute-1 systemd[1]: Reloading.
Feb 02 09:55:09 compute-1 ceph-mon[80115]: pgmap v463: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Feb 02 09:55:09 compute-1 systemd-sysv-generator[202209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:55:09 compute-1 systemd-rc-local-generator[202205]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:55:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:09 compute-1 sudo[202138]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:10.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:10.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:10 compute-1 sshd-session[144138]: Connection closed by 192.168.122.30 port 53654
Feb 02 09:55:10 compute-1 sshd-session[144135]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:55:10 compute-1 systemd[1]: session-52.scope: Deactivated successfully.
Feb 02 09:55:10 compute-1 systemd[1]: session-52.scope: Consumed 3min 11.353s CPU time.
Feb 02 09:55:10 compute-1 systemd-logind[805]: Session 52 logged out. Waiting for processes to exit.
Feb 02 09:55:10 compute-1 systemd-logind[805]: Removed session 52.
Feb 02 09:55:10 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:55:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:11 compute-1 ceph-mon[80115]: pgmap v464: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:55:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:12.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:12.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:55:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:55:12 compute-1 ceph-mon[80115]: pgmap v465: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:55:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:55:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:55:12 compute-1 ceph-mon[80115]: pgmap v466: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Feb 02 09:55:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:55:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:55:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:55:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:55:12 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:55:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:13 compute-1 ceph-mon[80115]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Feb 02 09:55:13 compute-1 ceph-mon[80115]: Cluster is now healthy
Feb 02 09:55:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:14.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:14.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095514 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:55:14 compute-1 ceph-mon[80115]: pgmap v467: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 506 B/s wr, 2 op/s
Feb 02 09:55:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:15 compute-1 sshd-session[202242]: Accepted publickey for zuul from 192.168.122.30 port 57414 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:55:15 compute-1 systemd-logind[805]: New session 53 of user zuul.
Feb 02 09:55:15 compute-1 systemd[1]: Started Session 53 of User zuul.
Feb 02 09:55:15 compute-1 sshd-session[202242]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:55:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:16.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:16.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:16 compute-1 sudo[202298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:55:16 compute-1 sudo[202298]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:55:16 compute-1 sudo[202298]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:16 compute-1 python3.9[202420]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
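Editor's note: this setup run gathers only 'local' facts, i.e. whatever lives under fact_path=/etc/ansible/facts.d; *.fact files there are read as JSON/INI, and executable files are run and expected to print JSON, which then appears under ansible_local. A tiny executable fact script as an illustration; the file name and its contents are hypothetical, not something present in this log:

    #!/usr/bin/python3.9
    # Hypothetical /etc/ansible/facts.d/edpm.fact: an executable local fact,
    # surfaced by the setup module as ansible_local.edpm.
    import json
    import platform

    print(json.dumps({"deployed_by": "edpm_ansible",
                      "kernel": platform.release()}))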
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.198976) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117199082, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4666, "num_deletes": 502, "total_data_size": 12826185, "memory_usage": 12996592, "flush_reason": "Manual Compaction"}
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Feb 02 09:55:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117252415, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 8261921, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13324, "largest_seqno": 17985, "table_properties": {"data_size": 8244315, "index_size": 11860, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4677, "raw_key_size": 36899, "raw_average_key_size": 19, "raw_value_size": 8207635, "raw_average_value_size": 4393, "num_data_blocks": 518, "num_entries": 1868, "num_filter_entries": 1868, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025678, "oldest_key_time": 1770025678, "file_creation_time": 1770026117, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 53581 microseconds, and 18215 cpu microseconds.
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.252564) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 8261921 bytes OK
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.252622) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.259331) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.259353) EVENT_LOG_v1 {"time_micros": 1770026117259346, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.259372) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 12805597, prev total WAL file size 12842136, number of live WAL files 2.
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.261741) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(8068KB)], [27(11MB)]
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117261831, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 20464118, "oldest_snapshot_seqno": -1}
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5083 keys, 15279652 bytes, temperature: kUnknown
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117389434, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15279652, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15241110, "index_size": 24736, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12741, "raw_key_size": 127177, "raw_average_key_size": 25, "raw_value_size": 15144227, "raw_average_value_size": 2979, "num_data_blocks": 1039, "num_entries": 5083, "num_filter_entries": 5083, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026117, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.389803) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15279652 bytes
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.391999) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.2 rd, 119.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(7.9, 11.6 +0.0 blob) out(14.6 +0.0 blob), read-write-amplify(4.3) write-amplify(1.8) OK, records in: 6106, records dropped: 1023 output_compression: NoCompression
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.392037) EVENT_LOG_v1 {"time_micros": 1770026117392020, "job": 14, "event": "compaction_finished", "compaction_time_micros": 127722, "compaction_time_cpu_micros": 37846, "output_level": 6, "num_output_files": 1, "total_output_size": 15279652, "num_input_records": 6106, "num_output_records": 5083, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117394107, "job": 14, "event": "table_file_deletion", "file_number": 29}
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026117396371, "job": 14, "event": "table_file_deletion", "file_number": 27}
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.261615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396441) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396445) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:17 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:17.396453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
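Editor's note: the ceph-mon rocksdb burst above is a memtable flush (JOB 13) followed by a manual L0-to-L6 compaction (JOB 14); the EVENT_LOG_v1 lines carry machine-readable JSON with sizes, key counts, and timings. A sketch for pulling those events out of a saved journal extract (the input file name is an assumption):

    import json
    import re

    EVENT = re.compile(r"EVENT_LOG_v1 (\{.*\})\s*$")

    def rocksdb_events(logfile="compute-1-journal.log"):
        # Yield the JSON payload of every rocksdb EVENT_LOG_v1 line.
        with open(logfile) as f:
            for line in f:
                m = EVENT.search(line)
                if m:
                    yield json.loads(m.group(1))

    for ev in rocksdb_events():
        if ev.get("event") in ("flush_finished", "compaction_finished"):
            print(ev["job"], ev["event"], ev.get("total_output_size"))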
Feb 02 09:55:17 compute-1 ceph-mon[80115]: pgmap v468: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 506 B/s wr, 2 op/s
Feb 02 09:55:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:55:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:55:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:55:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:55:17 compute-1 podman[202450]: 2026-02-02 09:55:17.410304372 +0000 UTC m=+0.081715001 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 02 09:55:17 compute-1 sudo[202462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:55:17 compute-1 sudo[202462]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:55:17 compute-1 sudo[202462]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:18.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:18.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:18 compute-1 python3.9[202619]: ansible-ansible.builtin.service_facts Invoked
Feb 02 09:55:18 compute-1 network[202636]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 02 09:55:18 compute-1 network[202637]: 'network-scripts' will be removed from distribution in near future.
Feb 02 09:55:18 compute-1 network[202638]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 02 09:55:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:19 compute-1 ceph-mon[80115]: pgmap v469: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 506 B/s wr, 2 op/s
Feb 02 09:55:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:55:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:20.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:55:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:20.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:21 compute-1 ceph-mon[80115]: pgmap v470: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:55:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:22.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:22.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:22 compute-1 sudo[202911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejopxaozwjodwpsjzzlzlfojnlirojej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026122.5600471-97-99278689447122/AnsiballZ_setup.py'
Feb 02 09:55:22 compute-1 sudo[202911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:23 compute-1 python3.9[202913]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 02 09:55:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:23 compute-1 sudo[202911]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:23 compute-1 ceph-mon[80115]: pgmap v471: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 304 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:55:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:23 compute-1 sudo[202995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqcuarbgzqrdlbqyxewohgtjjeckghsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026122.5600471-97-99278689447122/AnsiballZ_dnf.py'
Feb 02 09:55:23 compute-1 sudo[202995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:24 compute-1 python3.9[202997]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:55:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:24.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:24.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:25 compute-1 ceph-mon[80115]: pgmap v472: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:55:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.003000079s ======
Feb 02 09:55:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:26.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000079s
Feb 02 09:55:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:26.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:27 compute-1 ceph-mon[80115]: pgmap v473: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:28.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:28.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:28 compute-1 sudo[202995]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:29 compute-1 ceph-mon[80115]: pgmap v474: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:55:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:30.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:30.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:30 compute-1 sudo[203151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swzkeufihfxoekmgbptvhdhitgtatcnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026129.781438-133-117920102473005/AnsiballZ_stat.py'
Feb 02 09:55:30 compute-1 sudo[203151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:30 compute-1 python3.9[203153]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:55:30 compute-1 sudo[203151]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:31 compute-1 sudo[203304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-semooedxzpaszbhsbkxtrkvbmbucwxle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026130.805329-163-23578839629982/AnsiballZ_command.py'
Feb 02 09:55:31 compute-1 sudo[203304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:31 compute-1 python3.9[203306]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:55:31 compute-1 sudo[203304]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:31 compute-1 ceph-mon[80115]: pgmap v475: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb 02 09:55:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:32.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb 02 09:55:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:32.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:32 compute-1 sudo[203457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puczinfgtjeoyggplrldtlcxxrsdqfrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026132.0332859-194-27172594147955/AnsiballZ_stat.py'
Feb 02 09:55:32 compute-1 sudo[203457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:32 compute-1 python3.9[203459]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:55:32 compute-1 sudo[203457]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:55:33 compute-1 sudo[203610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apqgrpvepzmddrsakbkkzvykhcwtahzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026132.7612333-217-219310294313830/AnsiballZ_command.py'
Feb 02 09:55:33 compute-1 sudo[203610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:33 compute-1 python3.9[203612]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:55:33 compute-1 sudo[203610]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:33 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:33 compute-1 ceph-mon[80115]: pgmap v476: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:33 compute-1 sudo[203765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clvsndvohvdmalbtxhuifdycvaemfrlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026133.535011-241-58220896284172/AnsiballZ_stat.py'
Feb 02 09:55:33 compute-1 sudo[203765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:33 compute-1 python3.9[203767]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:55:33 compute-1 sudo[203765]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:34.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:34.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:34 compute-1 sudo[203888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdjlznsdaayfyhtlekfjapibdwkxwzjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026133.535011-241-58220896284172/AnsiballZ_copy.py'
Feb 02 09:55:34 compute-1 sudo[203888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:34 compute-1 ceph-mon[80115]: pgmap v477: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:55:34 compute-1 python3.9[203890]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026133.535011-241-58220896284172/.source.iscsi _original_basename=.l_7mqm52 follow=False checksum=7a06202972a249b967f1a984bcbd8e2cf3a33f8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:34 compute-1 sudo[203888]: pam_unix(sudo:session): session closed for user root
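[editor's note] The copy task above reports checksum=7a06202972a249b967f1a984bcbd8e2cf3a33f8a for the initiator name file it wrote. Purely as an illustration (the play itself does not run this; the path is taken from the task, the comparison step is an assumption), the same SHA-1 content checksum can be recomputed locally like this:

    import hashlib

    def sha1_of(path: str) -> str:
        """SHA-1 of a file's contents, read in chunks, as a hex string."""
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        # Compare against the checksum= value logged by the copy task above.
        print(sha1_of("/etc/iscsi/initiatorname.iscsi"))

[/editor's note]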
Feb 02 09:55:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:35 compute-1 sudo[204041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlsodvjzflihjlayumqxdxxtwdhpolni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026134.8936048-286-37416607631042/AnsiballZ_file.py'
Feb 02 09:55:35 compute-1 sudo[204041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:35 compute-1 python3.9[204043]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:35 compute-1 sudo[204041]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00031e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:35 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:36.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:36.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:36 compute-1 sudo[204193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tichldkkfygvrnavmcnzldxqizfctany ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026135.775882-310-128491302786013/AnsiballZ_lineinfile.py'
Feb 02 09:55:36 compute-1 sudo[204193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:36 compute-1 sudo[204196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:55:36 compute-1 sudo[204196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:55:36 compute-1 sudo[204196]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:36 compute-1 python3.9[204195]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:36 compute-1 sudo[204193]: pam_unix(sudo:session): session closed for user root
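[editor's note] The lineinfile invocation above ensures a single "node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5" line in /etc/iscsi/iscsid.conf, replacing any line that matches the regexp or, when none matches, inserting after the commented default. The sketch below is a rough Python approximation of those replace-or-insert semantics, not the module's actual implementation; the two patterns are copied verbatim from the task (dots left unescaped, as logged).

    import re

    LINE = "node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5"
    MATCH = re.compile(r"^node.session.auth.chap_algs")          # regexp= from the task
    INSERT_AFTER = re.compile(r"^#node.session.auth.chap.algs")  # insertafter= from the task

    def ensure_line(lines):
        """Return a copy of lines with LINE present, mimicking lineinfile's behaviour."""
        out = list(lines)
        matches = [i for i, l in enumerate(out) if MATCH.match(l)]
        if matches:
            out[matches[-1]] = LINE            # lineinfile replaces the last matching line
            return out
        anchors = [i for i, l in enumerate(out) if INSERT_AFTER.match(l)]
        if anchors:
            out.insert(anchors[-1] + 1, LINE)  # otherwise insert after the last anchor match
        else:
            out.append(LINE)                   # or append at end of file
        return out

    if __name__ == "__main__":
        demo = [
            "#node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5",
            "node.session.timeo.replacement_timeout = 120",
        ]
        print("\n".join(ensure_line(demo)))

[/editor's note]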
Feb 02 09:55:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:37 compute-1 ceph-mon[80115]: pgmap v478: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:37 compute-1 sudo[204371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pncjgfsexalokdjcwrhjgvautabjkvzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026136.8242698-337-272559909457922/AnsiballZ_systemd_service.py'
Feb 02 09:55:37 compute-1 sudo[204371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:37 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:37 compute-1 python3.9[204373]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:55:37 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 02 09:55:37 compute-1 sudo[204371]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:38.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:38.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:38 compute-1 podman[204477]: 2026-02-02 09:55:38.396941418 +0000 UTC m=+0.074332415 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
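[editor's note] The podman line above is a container health_status event for ovn_controller, with the status buried in the parenthesized key=value attribute list. As an illustration only of reducing such a journal line to the fields of interest (field names come from the line itself; the shortened sample and the regex are assumptions):

    import re

    HEALTH_RE = re.compile(
        r"container health_status (?P<id>[0-9a-f]+) \(.*?"
        r"name=(?P<name>[^,]+), .*?health_status=(?P<status>[^,]+),"
    )

    def parse_health(line: str):
        """Extract container id, name and health status from a podman event line."""
        m = HEALTH_RE.search(line)
        return m.groupdict() if m else None

    if __name__ == "__main__":
        # Shortened sample in the same field order as the journal line above.
        sample = ("2026-02-02 09:55:38.396941418 +0000 UTC m=+0.074332415 container "
                  "health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 "
                  "(image=quay.io/example@sha256:abc, name=ovn_controller, "
                  "health_status=healthy, health_failing_streak=0)")
        print(parse_health(sample))

[/editor's note]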
Feb 02 09:55:38 compute-1 sudo[204554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljbyfqluteiqmyoqfzsqomcpenjjcudv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026138.163675-361-85419538420470/AnsiballZ_systemd_service.py'
Feb 02 09:55:38 compute-1 sudo[204554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:38 compute-1 python3.9[204556]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:55:38 compute-1 systemd[1]: Reloading.
Feb 02 09:55:38 compute-1 systemd-sysv-generator[204582]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:55:38 compute-1 systemd-rc-local-generator[204578]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:55:39 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 02 09:55:39 compute-1 systemd[1]: Starting Open-iSCSI...
Feb 02 09:55:39 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Feb 02 09:55:39 compute-1 systemd[1]: Started Open-iSCSI.
Feb 02 09:55:39 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 02 09:55:39 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 02 09:55:39 compute-1 sudo[204554]: pam_unix(sudo:session): session closed for user root
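[editor's note] The two systemd_service tasks above enable and start iscsid.socket and then iscsid, and the journal shows the socket listening and the one-shot iSCSI units completing. A small, illustrative follow-up check of the resulting state is sketched below; the unit names and the initiator file path are taken from the log, the verification step itself is an assumption.

    import subprocess

    def unit_state(unit: str) -> str:
        """Return systemd's is-active state for a unit (e.g. 'active', 'inactive')."""
        r = subprocess.run(["systemctl", "is-active", unit],
                           capture_output=True, text=True)
        return r.stdout.strip() or r.stderr.strip()

    if __name__ == "__main__":
        for unit in ("iscsid.socket", "iscsid.service"):
            print(unit, unit_state(unit))
        # Initiator name written earlier by the copy task:
        with open("/etc/iscsi/initiatorname.iscsi") as f:
            print(f.read().strip())

[/editor's note]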
Feb 02 09:55:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:39 compute-1 ceph-mon[80115]: pgmap v479: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:55:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:39 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:40.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:55:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:40.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:55:40 compute-1 python3.9[204755]: ansible-ansible.builtin.service_facts Invoked
Feb 02 09:55:40 compute-1 network[204772]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 02 09:55:40 compute-1 network[204773]: 'network-scripts' will be removed from distribution in near future.
Feb 02 09:55:40 compute-1 network[204774]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 02 09:55:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:41 compute-1 ceph-mon[80115]: pgmap v480: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:41 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.021858) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142021895, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 478, "num_deletes": 250, "total_data_size": 753496, "memory_usage": 763376, "flush_reason": "Manual Compaction"}
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142026720, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 400569, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17990, "largest_seqno": 18463, "table_properties": {"data_size": 398101, "index_size": 568, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6412, "raw_average_key_size": 19, "raw_value_size": 393094, "raw_average_value_size": 1202, "num_data_blocks": 26, "num_entries": 327, "num_filter_entries": 327, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026117, "oldest_key_time": 1770026117, "file_creation_time": 1770026142, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 4908 microseconds, and 1932 cpu microseconds.
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.026764) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 400569 bytes OK
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.026781) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.028686) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.028701) EVENT_LOG_v1 {"time_micros": 1770026142028697, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.028716) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 750596, prev total WAL file size 750596, number of live WAL files 2.
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.029122) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(391KB)], [30(14MB)]
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142029228, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 15680221, "oldest_snapshot_seqno": -1}
Feb 02 09:55:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:42.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4906 keys, 11699367 bytes, temperature: kUnknown
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142112273, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11699367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11666307, "index_size": 19702, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12293, "raw_key_size": 123909, "raw_average_key_size": 25, "raw_value_size": 11576842, "raw_average_value_size": 2359, "num_data_blocks": 819, "num_entries": 4906, "num_filter_entries": 4906, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026142, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.112665) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11699367 bytes
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.117815) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 188.5 rd, 140.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 14.6 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(68.4) write-amplify(29.2) OK, records in: 5410, records dropped: 504 output_compression: NoCompression
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.117838) EVENT_LOG_v1 {"time_micros": 1770026142117829, "job": 16, "event": "compaction_finished", "compaction_time_micros": 83181, "compaction_time_cpu_micros": 32724, "output_level": 6, "num_output_files": 1, "total_output_size": 11699367, "num_input_records": 5410, "num_output_records": 4906, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142117987, "job": 16, "event": "table_file_deletion", "file_number": 32}
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142119649, "job": 16, "event": "table_file_deletion", "file_number": 30}
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.029017) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119734) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:55:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:55:42.119747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
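[editor's note] The ceph-mon rocksdb lines above embed machine-readable EVENT_LOG_v1 JSON payloads (flush_started, table_file_creation, flush_finished, compaction_started, compaction_finished). A minimal sketch for recovering those payloads from journal lines follows; the prefix handling is an assumption based only on the exact lines in this log.

    import json
    import re

    EVENT_RE = re.compile(r"EVENT_LOG_v1 (\{.*\})")

    def extract_events(lines):
        """Yield parsed EVENT_LOG_v1 JSON objects found in rocksdb journal lines."""
        for line in lines:
            m = EVENT_RE.search(line)
            if not m:
                continue
            try:
                yield json.loads(m.group(1))
            except json.JSONDecodeError:
                continue  # skip truncated or otherwise unparsable payloads

    if __name__ == "__main__":
        sample = ('rocksdb: EVENT_LOG_v1 {"time_micros": 1770026142021895, "job": 15, '
                  '"event": "flush_started", "num_entries": 478}')
        for ev in extract_events([sample]):
            print(ev["job"], ev["event"])

[/editor's note]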
Feb 02 09:55:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:42.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:43 compute-1 ceph-mon[80115]: pgmap v481: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:43 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:44 compute-1 sudo[205046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltsfpwekkrahunoahujeayxeoglypsem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026143.752057-430-107053388395446/AnsiballZ_dnf.py'
Feb 02 09:55:44 compute-1 sudo[205046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:44.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:44.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:44 compute-1 python3.9[205048]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:55:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:55:44.894 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:55:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:55:44.895 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:55:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:55:44.895 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:55:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:45 compute-1 ceph-mon[80115]: pgmap v482: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:55:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:45 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:46.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:55:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:46.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:55:46 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 02 09:55:46 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 02 09:55:46 compute-1 systemd[1]: Reloading.
Feb 02 09:55:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:46 compute-1 systemd-sysv-generator[205095]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:55:46 compute-1 systemd-rc-local-generator[205092]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:55:46 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 02 09:55:47 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 02 09:55:47 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 02 09:55:47 compute-1 systemd[1]: run-r2b981ab0999c4ed39aa5df20ad3a9b03.service: Deactivated successfully.
Feb 02 09:55:47 compute-1 sudo[205046]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:47 compute-1 ceph-mon[80115]: pgmap v483: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:55:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:47 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:47 compute-1 sudo[205376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvgqjmndzexqlikldjokqqeyewlkvmoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026147.585579-457-209979941903521/AnsiballZ_file.py'
Feb 02 09:55:47 compute-1 sudo[205376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:47 compute-1 podman[205339]: 2026-02-02 09:55:47.985952676 +0000 UTC m=+0.107899086 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 02 09:55:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:48.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:48 compute-1 python3.9[205382]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 02 09:55:48 compute-1 sudo[205376]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:48.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:48 compute-1 sudo[205537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyaunjdxcpudsnqgkgmcevakiujudyot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026148.3376508-481-40282139172404/AnsiballZ_modprobe.py'
Feb 02 09:55:48 compute-1 sudo[205537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:48 compute-1 python3.9[205539]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 02 09:55:48 compute-1 sudo[205537]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:49 compute-1 ceph-mon[80115]: pgmap v484: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:55:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:49 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:49 compute-1 sudo[205694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfxqanmsejmhezyustpbuexacqqtbvqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026149.2161922-505-160903731970247/AnsiballZ_stat.py'
Feb 02 09:55:49 compute-1 sudo[205694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:49 compute-1 python3.9[205696]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:55:49 compute-1 sudo[205694]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:50.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:50 compute-1 sudo[205817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvtgsrwbkwhcmditytafxoqioztihlnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026149.2161922-505-160903731970247/AnsiballZ_copy.py'
Feb 02 09:55:50 compute-1 sudo[205817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:50.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:50 compute-1 python3.9[205819]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026149.2161922-505-160903731970247/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:50 compute-1 sudo[205817]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:50 compute-1 sudo[205970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noqdsgrjirtexofaxaeejkntgcwnedvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026150.6926649-553-32944591026958/AnsiballZ_lineinfile.py'
Feb 02 09:55:50 compute-1 sudo[205970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:51 compute-1 python3.9[205972]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:51 compute-1 sudo[205970]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:51 compute-1 ceph-mon[80115]: pgmap v485: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:51 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8002830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:52.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:52 compute-1 sudo[206122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxplpelxhqgqhpmabolblarcayaymfnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026151.3907952-578-276211349901064/AnsiballZ_systemd.py'
Feb 02 09:55:52 compute-1 sudo[206122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:52.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:52 compute-1 python3.9[206124]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:55:52 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 02 09:55:52 compute-1 systemd[1]: Stopped Load Kernel Modules.
Feb 02 09:55:52 compute-1 systemd[1]: Stopping Load Kernel Modules...
Feb 02 09:55:52 compute-1 systemd[1]: Starting Load Kernel Modules...
Feb 02 09:55:52 compute-1 systemd[1]: Finished Load Kernel Modules.
Feb 02 09:55:52 compute-1 sudo[206122]: pam_unix(sudo:session): session closed for user root
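The entries from 09:55:48 to 09:55:52 record the EDPM Ansible run loading dm-multipath immediately (community.general.modprobe), persisting it through /etc/modules-load.d/dm-multipath.conf and a line in /etc/modules, and then restarting systemd-modules-load.service so the persisted configuration is exercised right away. Below is a minimal Python sketch of that load-and-persist pattern; it assumes root privileges and reuses the module name and paths visible in the log, but it is only an illustration of the pattern, not the Ansible tasks that actually ran.

    #!/usr/bin/env python3
    """Illustrative sketch: load a kernel module now and persist it via
    /etc/modules-load.d/ so systemd-modules-load.service reloads it at boot."""
    import subprocess
    from pathlib import Path

    def load_and_persist(module: str, conf_dir: str = "/etc/modules-load.d") -> None:
        # Load the module immediately; modprobe is a no-op if it is already loaded.
        subprocess.run(["modprobe", module], check=True)
        # Drop a one-line config so the module is loaded on every boot.
        conf = Path(conf_dir) / f"{module}.conf"
        conf.write_text(module + "\n")
        conf.chmod(0o644)
        # Restart the loader so the new file is validated straight away,
        # mirroring the systemd-modules-load.service restart logged above.
        subprocess.run(["systemctl", "restart", "systemd-modules-load.service"], check=True)

    if __name__ == "__main__":
        load_and_persist("dm-multipath")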
Feb 02 09:55:53 compute-1 sudo[206279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbxnbkxmeioafqixykakfavoywgrgrhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026152.7351813-601-98510379545698/AnsiballZ_command.py'
Feb 02 09:55:53 compute-1 sudo[206279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:53 compute-1 python3.9[206281]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:55:53 compute-1 sudo[206279]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:53 compute-1 ceph-mon[80115]: pgmap v486: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:53 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:53 compute-1 sudo[206432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-youkqhfpostmswwyizekiefwtyffeggw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026153.6472328-631-122855934185246/AnsiballZ_stat.py'
Feb 02 09:55:53 compute-1 sudo[206432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:55:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:54.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:55:54 compute-1 python3.9[206434]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:55:54 compute-1 sudo[206432]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:55:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:54.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:55:54 compute-1 sudo[206585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myzpcpmkxhbmquflijrjzldcwxoljwid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026154.4315484-658-57031365423968/AnsiballZ_stat.py'
Feb 02 09:55:54 compute-1 sudo[206585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:54 compute-1 python3.9[206587]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:55:54 compute-1 sudo[206585]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:55 compute-1 sudo[206708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlajqlzjpitkxpqdptqezqptddldjyxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026154.4315484-658-57031365423968/AnsiballZ_copy.py'
Feb 02 09:55:55 compute-1 sudo[206708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:55 compute-1 ceph-mon[80115]: pgmap v487: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:55:55 compute-1 python3.9[206710]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026154.4315484-658-57031365423968/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6040040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:55 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:55 compute-1 sudo[206708]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:55:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:56.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:55:56 compute-1 sudo[206860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwvaiesrehpucdwgdwbulfoswygbtynm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026155.8462088-703-92014500204403/AnsiballZ_command.py'
Feb 02 09:55:56 compute-1 sudo[206860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:55:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:56.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:55:56 compute-1 python3.9[206862]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:55:56 compute-1 sudo[206860]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:56 compute-1 sudo[206865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:55:56 compute-1 sudo[206865]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:55:56 compute-1 sudo[206865]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:55:56 compute-1 sudo[207039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpvwfqvgzizrlmcngtsbhhgvvulqxmth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026156.5968037-727-215021246420692/AnsiballZ_lineinfile.py'
Feb 02 09:55:56 compute-1 sudo[207039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:57 compute-1 python3.9[207041]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:57 compute-1 sudo[207039]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:57 compute-1 ceph-mon[80115]: pgmap v488: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:55:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:57 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:57 compute-1 sudo[207191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwvowieryzrnoirnqgrapvbhhyseygjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026157.318122-751-204528402216729/AnsiballZ_replace.py'
Feb 02 09:55:57 compute-1 sudo[207191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:58 compute-1 python3.9[207193]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:58 compute-1 sudo[207191]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:55:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:55:58.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:55:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:55:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:55:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:55:58.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:55:58 compute-1 sudo[207343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfentehwhvobqkggpqyauwkwctxspgsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026158.3452735-775-137760129558525/AnsiballZ_replace.py'
Feb 02 09:55:58 compute-1 sudo[207343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:58 compute-1 python3.9[207345]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:58 compute-1 sudo[207343]: pam_unix(sudo:session): session closed for user root
Feb 02 09:55:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:59 compute-1 ceph-mon[80115]: pgmap v489: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:55:59 compute-1 sudo[207496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqzvwmyeceayhwbysitawrvfkefhordz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026159.18485-802-149032895931698/AnsiballZ_lineinfile.py'
Feb 02 09:55:59 compute-1 sudo[207496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:55:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:55:59 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:55:59 compute-1 python3.9[207498]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:55:59 compute-1 sudo[207496]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:00.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:00 compute-1 sudo[207648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epnlvpfzpsgoiqdweyhntyshcihpdyjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026159.8586557-802-129853938977727/AnsiballZ_lineinfile.py'
Feb 02 09:56:00 compute-1 sudo[207648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:00.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:00 compute-1 python3.9[207650]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:00 compute-1 sudo[207648]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:00 compute-1 sudo[207800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoglbaetctobqrqpmcuxbkmtzcsaxmmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026160.4285827-802-67748002419774/AnsiballZ_lineinfile.py'
Feb 02 09:56:00 compute-1 sudo[207800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:00 compute-1 python3.9[207802]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:00 compute-1 sudo[207800]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:01 compute-1 sudo[207953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dijisbbxyeubpobjlztqetenkibsbhzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026161.0441895-802-179910907045120/AnsiballZ_lineinfile.py'
Feb 02 09:56:01 compute-1 sudo[207953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:01 compute-1 ceph-mon[80115]: pgmap v490: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:56:01 compute-1 python3.9[207955]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb614003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:01 compute-1 sudo[207953]: pam_unix(sudo:session): session closed for user root
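Between 09:55:54 and 09:56:01 the run assembles /etc/multipath.conf in place: it copies a base file, apparently guarantees an empty blacklist { } stanza (the lineinfile adds the opening line, one replace closes the brace and another drops a blanket devnode ".*" entry), and then pins four defaults - find_multipaths yes, recheck_wwid yes, skip_kpartx yes and user_friendly_names no - each one either replacing an existing setting that matches its regexp or being inserted after the first ^defaults line. The sketch below approximates that insert-or-replace-after-defaults pattern in plain Python, using the file path, regexps and option lines taken from the logged lineinfile parameters; it is an approximation of the pattern, not the ansible.builtin.lineinfile module itself.

    #!/usr/bin/env python3
    """Illustrative sketch of the idempotent multipath.conf edits logged above."""
    import re
    from pathlib import Path

    def ensure_default(conf: Path, regexp: str, line: str) -> bool:
        """Replace the first line matching regexp, or insert the line right
        after the first ^defaults line when no match exists. Returns True
        when the file was changed."""
        lines = conf.read_text().splitlines()
        pat = re.compile(regexp)
        for i, current in enumerate(lines):
            if pat.search(current):
                if current == line:
                    return False               # already the desired value
                lines[i] = line                 # replace the existing setting
                break
        else:
            for i, current in enumerate(lines):
                if re.match(r"^defaults", current):
                    lines.insert(i + 1, line)   # insertafter=^defaults, firstmatch
                    break
            else:
                lines.append(line)              # no defaults stanza at all
        conf.write_text("\n".join(lines) + "\n")
        return True

    if __name__ == "__main__":
        conf = Path("/etc/multipath.conf")
        for regexp, line in [
            (r"^\s+find_multipaths", "        find_multipaths yes"),
            (r"^\s+recheck_wwid", "        recheck_wwid yes"),
            (r"^\s+skip_kpartx", "        skip_kpartx yes"),
            (r"^\s+user_friendly_names", "        user_friendly_names no"),
        ]:
            ensure_default(conf, regexp, line)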
Feb 02 09:56:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:01 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:01 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:02 compute-1 sudo[208105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnwjbigkkrlpsujbmwcrcdbysseebrxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026161.7931366-889-139908006696523/AnsiballZ_stat.py'
Feb 02 09:56:02 compute-1 sudo[208105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:02.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:02.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:02 compute-1 python3.9[208107]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:56:02 compute-1 sudo[208105]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:56:02 compute-1 sudo[208260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwmpvhkwcnozkaubmdfifqfwzdeltrva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026162.5528727-913-8595723261026/AnsiballZ_command.py'
Feb 02 09:56:02 compute-1 sudo[208260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:03 compute-1 python3.9[208262]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:03 compute-1 sudo[208260]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095603 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:56:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:03 compute-1 ceph-mon[80115]: pgmap v491: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:56:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb604004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:03 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:03 compute-1 sudo[208414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vawwlrjamqrjvmlqiydhgnwylnmglozt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026163.4242282-940-179390694036892/AnsiballZ_systemd_service.py'
Feb 02 09:56:03 compute-1 sudo[208414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:03 compute-1 python3.9[208416]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:04 compute-1 systemd[1]: Listening on multipathd control socket.
Feb 02 09:56:04 compute-1 sudo[208414]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:04.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:04.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:04 compute-1 ceph-mon[80115]: pgmap v492: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:56:04 compute-1 sudo[208571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzxixdfcnycxsbmfxpcgbyidlonfrjkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026164.4503121-964-202496552103192/AnsiballZ_systemd_service.py'
Feb 02 09:56:04 compute-1 sudo[208571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:05 compute-1 python3.9[208573]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:05 compute-1 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 02 09:56:05 compute-1 udevadm[208578]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 02 09:56:05 compute-1 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 02 09:56:05 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 02 09:56:05 compute-1 multipathd[208582]: --------start up--------
Feb 02 09:56:05 compute-1 multipathd[208582]: read /etc/multipath.conf
Feb 02 09:56:05 compute-1 multipathd[208582]: path checkers start up
Feb 02 09:56:05 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 02 09:56:05 compute-1 sudo[208571]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:05 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:06.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:06.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:06 compute-1 sudo[208740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lelpmjtbfndokjzubaivlitgtbqvmahy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026165.9489694-1000-135152228430034/AnsiballZ_file.py'
Feb 02 09:56:06 compute-1 sudo[208740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:06 compute-1 python3.9[208742]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 02 09:56:06 compute-1 sudo[208740]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:06 compute-1 sudo[208893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghgsvmvtijjmxpmkopvimkswmzyaupne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026166.6684873-1024-234358039934195/AnsiballZ_modprobe.py'
Feb 02 09:56:06 compute-1 sudo[208893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:07 compute-1 python3.9[208895]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 02 09:56:07 compute-1 kernel: Key type psk registered
Feb 02 09:56:07 compute-1 sudo[208893]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:07 compute-1 ceph-mon[80115]: pgmap v493: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:56:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:07 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:07 compute-1 sudo[209055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqzxumgyaxnpixjpdclfbjddmjnwgugu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026167.4130406-1048-108689643082974/AnsiballZ_stat.py'
Feb 02 09:56:07 compute-1 sudo[209055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:07 compute-1 python3.9[209057]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:56:07 compute-1 sudo[209055]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:08.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:08.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:08 compute-1 sudo[209178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oenyciwjqmpgmfyoymzkzlbyfnlytpmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026167.4130406-1048-108689643082974/AnsiballZ_copy.py'
Feb 02 09:56:08 compute-1 sudo[209178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:08 compute-1 python3.9[209180]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1770026167.4130406-1048-108689643082974/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:08 compute-1 sudo[209178]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:09 compute-1 sudo[209342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzwiulvqdrezxkxeqyuatiaqjtohitjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026168.8121977-1096-100672775594366/AnsiballZ_lineinfile.py'
Feb 02 09:56:09 compute-1 sudo[209342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:09 compute-1 podman[209305]: 2026-02-02 09:56:09.156201369 +0000 UTC m=+0.090244217 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 02 09:56:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:09 compute-1 python3.9[209352]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:09 compute-1 sudo[209342]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:09 compute-1 ceph-mon[80115]: pgmap v494: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:56:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:09 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:09 compute-1 sudo[209510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-othmlfpmplgoajcknswahimjotlirjsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026169.5507767-1120-4756300351984/AnsiballZ_systemd.py'
Feb 02 09:56:09 compute-1 sudo[209510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:10 compute-1 python3.9[209512]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:56:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:10.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:10.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:11 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 02 09:56:11 compute-1 systemd[1]: Stopped Load Kernel Modules.
Feb 02 09:56:11 compute-1 systemd[1]: Stopping Load Kernel Modules...
Feb 02 09:56:11 compute-1 systemd[1]: Starting Load Kernel Modules...
Feb 02 09:56:11 compute-1 systemd[1]: Finished Load Kernel Modules.
Feb 02 09:56:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:11 compute-1 sudo[209510]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:11 compute-1 ceph-mon[80115]: pgmap v495: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:56:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:11 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:56:11 compute-1 sudo[209667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twliwbphgqxouvsszydlflplxoplvzhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026171.661172-1144-243690406821696/AnsiballZ_dnf.py'
Feb 02 09:56:11 compute-1 sudo[209667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:12.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:12.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:12 compute-1 python3.9[209669]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 02 09:56:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:13 compute-1 ceph-mon[80115]: pgmap v496: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:56:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:13 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:14.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:14.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:14 compute-1 systemd[1]: Reloading.
Feb 02 09:56:14 compute-1 systemd-rc-local-generator[209700]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:56:14 compute-1 systemd-sysv-generator[209705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:56:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:14 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:56:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:14 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:56:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:14 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:56:14 compute-1 systemd[1]: Reloading.
Feb 02 09:56:14 compute-1 systemd-rc-local-generator[209732]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:56:14 compute-1 systemd-sysv-generator[209735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:56:15 compute-1 systemd-logind[805]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 02 09:56:15 compute-1 systemd-logind[805]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 02 09:56:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:15 compute-1 lvm[209784]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 02 09:56:15 compute-1 lvm[209784]: VG ceph_vg0 finished
Feb 02 09:56:15 compute-1 ceph-mon[80115]: pgmap v497: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:56:15 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 02 09:56:15 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 02 09:56:15 compute-1 systemd[1]: Reloading.
Feb 02 09:56:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:15 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:15 compute-1 systemd-rc-local-generator[209827]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:56:15 compute-1 systemd-sysv-generator[209834]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:56:15 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 02 09:56:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:16.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:16.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:16 compute-1 sudo[209667]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:16 compute-1 sudo[210862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:56:16 compute-1 sudo[210862]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:56:16 compute-1 sudo[210862]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:16 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 02 09:56:16 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 02 09:56:16 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.482s CPU time.
Feb 02 09:56:16 compute-1 systemd[1]: run-rc9e90e4836ab4b33914d432eecb8e24a.service: Deactivated successfully.
Feb 02 09:56:17 compute-1 sudo[211162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdzsvyudzkblhbzvwvafsxjlzllvosuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026176.6807067-1168-157272649691419/AnsiballZ_systemd_service.py'
Feb 02 09:56:17 compute-1 sudo[211162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:17 compute-1 python3.9[211164]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:56:17 compute-1 systemd[1]: Stopping Open-iSCSI...
Feb 02 09:56:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:17 compute-1 iscsid[204596]: iscsid shutting down.
Feb 02 09:56:17 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Feb 02 09:56:17 compute-1 systemd[1]: Stopped Open-iSCSI.
Feb 02 09:56:17 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 02 09:56:17 compute-1 systemd[1]: Starting Open-iSCSI...
Feb 02 09:56:17 compute-1 systemd[1]: Started Open-iSCSI.
Feb 02 09:56:17 compute-1 sudo[211162]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:17 compute-1 ceph-mon[80115]: pgmap v498: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:56:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:56:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:17 compute-1 sudo[211217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:56:17 compute-1 sudo[211217]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:56:17 compute-1 sudo[211217]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:17 compute-1 sudo[211265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:56:17 compute-1 sudo[211265]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:56:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:17 : epoch 69807427 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:56:17 compute-1 sudo[211371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhueymtvqhtmkcxfgipnngsoruievdya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026177.629551-1192-32603920834622/AnsiballZ_systemd_service.py'
Feb 02 09:56:17 compute-1 sudo[211371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:18.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:18.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:18 compute-1 python3.9[211378]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:56:18 compute-1 sudo[211265]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:18 compute-1 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 02 09:56:18 compute-1 multipathd[208582]: exit (signal)
Feb 02 09:56:18 compute-1 multipathd[208582]: --------shut down-------
Feb 02 09:56:18 compute-1 systemd[1]: multipathd.service: Deactivated successfully.
Feb 02 09:56:18 compute-1 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 02 09:56:18 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 02 09:56:18 compute-1 multipathd[211428]: --------start up--------
Feb 02 09:56:18 compute-1 multipathd[211428]: read /etc/multipath.conf
Feb 02 09:56:18 compute-1 multipathd[211428]: path checkers start up
Feb 02 09:56:18 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 02 09:56:18 compute-1 podman[211405]: 2026-02-02 09:56:18.36799089 +0000 UTC m=+0.094911781 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 02 09:56:18 compute-1 sudo[211371]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:56:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:56:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:56:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:56:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:56:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:56:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:56:19 compute-1 python3.9[211586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 02 09:56:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb6080018c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:19 compute-1 ceph-mon[80115]: pgmap v499: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:56:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:19 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:20.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:20 compute-1 sudo[211740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usxxqgbhsezlhuwtphiolvqinkfegvwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026179.8699737-1244-171122042027060/AnsiballZ_file.py'
Feb 02 09:56:20 compute-1 sudo[211740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:20.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:20 compute-1 python3.9[211742]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:20 compute-1 sudo[211740]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:21 compute-1 sudo[211893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utokriiwbggjssunxpdbmywvmyszxqlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026180.9416244-1278-22825078240980/AnsiballZ_systemd_service.py'
Feb 02 09:56:21 compute-1 sudo[211893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:21 compute-1 python3.9[211895]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 02 09:56:21 compute-1 systemd[1]: Reloading.
Feb 02 09:56:21 compute-1 ceph-mon[80115]: pgmap v500: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:56:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:21 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:21 compute-1 systemd-sysv-generator[211918]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:56:21 compute-1 systemd-rc-local-generator[211915]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:56:21 compute-1 sudo[211893]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:56:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:22.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:56:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:22.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:22 compute-1 python3.9[212080]: ansible-ansible.builtin.service_facts Invoked
Feb 02 09:56:22 compute-1 ceph-mon[80115]: pgmap v501: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:56:22 compute-1 network[212097]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 02 09:56:22 compute-1 network[212098]: 'network-scripts' will be removed from distribution in near future.
Feb 02 09:56:22 compute-1 network[212099]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 02 09:56:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095623 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:56:23 compute-1 sudo[212118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:56:23 compute-1 sudo[212118]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:56:23 compute-1 sudo[212118]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:23 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:24.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:24.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:24 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:56:24 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:56:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:25 compute-1 ceph-mon[80115]: pgmap v502: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 4 op/s
Feb 02 09:56:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:25 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:56:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:26.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:56:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:26.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:27 compute-1 ceph-mon[80115]: pgmap v503: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Feb 02 09:56:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:27 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:56:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:28.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:56:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:28.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:28 compute-1 sudo[212398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyxuqpiagsoukqulpmnrmbbexsywgtle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026188.0366573-1334-60422933781097/AnsiballZ_systemd_service.py'
Feb 02 09:56:28 compute-1 sudo[212398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:28 compute-1 python3.9[212400]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:28 compute-1 sudo[212398]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:29 compute-1 sudo[212552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnvnybludugkzrilvfijuazmgcdnnkze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026188.746356-1334-110761884802945/AnsiballZ_systemd_service.py'
Feb 02 09:56:29 compute-1 sudo[212552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f8003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:29 compute-1 python3.9[212554]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:29 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 02 09:56:29 compute-1 sudo[212552]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:29 compute-1 ceph-mon[80115]: pgmap v504: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 426 B/s wr, 2 op/s
Feb 02 09:56:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb5f00042e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:29 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb600003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:29 compute-1 sudo[212706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgdvpgqbgdldvpmiuwxmqhihfsddnplw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026189.5344799-1334-224498769852282/AnsiballZ_systemd_service.py'
Feb 02 09:56:29 compute-1 sudo[212706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:30 compute-1 python3.9[212708]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:30.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:30 compute-1 sudo[212706]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:30.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:30 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 02 09:56:30 compute-1 sudo[212860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coihcjjphgnqptvqokpzexrpxsewncyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026190.2927744-1334-258592332382108/AnsiballZ_systemd_service.py'
Feb 02 09:56:30 compute-1 sudo[212860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:30 compute-1 python3.9[212862]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:30 compute-1 sudo[212860]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:31 compute-1 sudo[213014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iomhozfucuqzmxtagaifsixjnjzvyocd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026191.121076-1334-130719659413865/AnsiballZ_systemd_service.py'
Feb 02 09:56:31 compute-1 sudo[213014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:31 compute-1 ceph-mon[80115]: pgmap v505: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:56:31 compute-1 kernel: ganesha.nfsd[208288]: segfault at 50 ip 00007fb6a9d2f32e sp 00007fb611ffa210 error 4 in libntirpc.so.5.8[7fb6a9d14000+2c000] likely on CPU 5 (core 0, socket 5)
Feb 02 09:56:31 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:56:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[184913]: 02/02/2026 09:56:31 : epoch 69807427 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb608001a60 fd 48 proxy ignored for local
Feb 02 09:56:31 compute-1 systemd[1]: Started Process Core Dump (PID 213017/UID 0).
Feb 02 09:56:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:31 compute-1 python3.9[213016]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:31 compute-1 sudo[213014]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:32.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:32 compute-1 sudo[213169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqsydxcgeamorbrjnpikehaezeekstbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026191.8848574-1334-224917134681608/AnsiballZ_systemd_service.py'
Feb 02 09:56:32 compute-1 sudo[213169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:32.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:32 compute-1 systemd-coredump[213018]: Process 184917 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 62:
                                                    #0  0x00007fb6a9d2f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 09:56:32 compute-1 systemd[1]: systemd-coredump@6-213017-0.service: Deactivated successfully.
Feb 02 09:56:32 compute-1 python3.9[213171]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:56:32 compute-1 podman[213176]: 2026-02-02 09:56:32.491089344 +0000 UTC m=+0.039077289 container died 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Feb 02 09:56:32 compute-1 sudo[213169]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-b41398c763b3c102a46779e12a2f4cfcf9f278ef37c2c2c2f473f0bbe2f41a2c-merged.mount: Deactivated successfully.
Feb 02 09:56:32 compute-1 podman[213176]: 2026-02-02 09:56:32.52594949 +0000 UTC m=+0.073937445 container remove 368812cb123d5562d4341f805c1c6cec04ce7af35417771ccb5a9aba72f1c0d7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:56:32 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:56:32 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:56:32 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.287s CPU time.
Feb 02 09:56:33 compute-1 sudo[213370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fswtwdgmugmgtnrjsmcyccplaikbjews ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026192.6311235-1334-273280110367702/AnsiballZ_systemd_service.py'
Feb 02 09:56:33 compute-1 sudo[213370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:33 compute-1 python3.9[213372]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:33 compute-1 sudo[213370]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:33 compute-1 ceph-mon[80115]: pgmap v506: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:56:33 compute-1 sudo[213523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnpwpeyclrxwbxxqgxgufvxmmppraxcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026193.5303576-1334-103471418525214/AnsiballZ_systemd_service.py'
Feb 02 09:56:33 compute-1 sudo[213523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:34 compute-1 python3.9[213525]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:56:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:34.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:34 compute-1 sudo[213523]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:56:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:34.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:56:35 compute-1 sudo[213677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbkepdngomwjppclbskgnarcjeyjxmqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026194.89101-1511-109792275538097/AnsiballZ_file.py'
Feb 02 09:56:35 compute-1 sudo[213677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:35 compute-1 python3.9[213679]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:35 compute-1 sudo[213677]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:35 compute-1 ceph-mon[80115]: pgmap v507: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:56:35 compute-1 sudo[213829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcsryalzrqicguhmyjxecrcrpdbwufxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026195.5011356-1511-56810731110238/AnsiballZ_file.py'
Feb 02 09:56:35 compute-1 sudo[213829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:35 compute-1 python3.9[213831]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:35 compute-1 sudo[213829]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:56:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:36.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:56:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:36.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:36 compute-1 sudo[213955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:56:36 compute-1 sudo[213955]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:56:36 compute-1 sudo[213955]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:36 compute-1 sudo[214006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suyqjgidqgosxjieqivdeoacmoozlobo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026196.1882753-1511-280223293924635/AnsiballZ_file.py'
Feb 02 09:56:36 compute-1 sudo[214006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:36 compute-1 python3.9[214008]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:36 compute-1 sudo[214006]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:37 compute-1 sudo[214159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zufachkwiheazrilqsvgondidgmagiin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026197.0086162-1511-63767114219269/AnsiballZ_file.py'
Feb 02 09:56:37 compute-1 sudo[214159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:37 compute-1 python3.9[214161]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:37 compute-1 sudo[214159]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:37 compute-1 ceph-mon[80115]: pgmap v508: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:56:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095637 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:56:37 compute-1 sudo[214311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npscqnbxumdxuvgptkjnnuhofyoligxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026197.659483-1511-260994067059113/AnsiballZ_file.py'
Feb 02 09:56:37 compute-1 sudo[214311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:38.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:38 compute-1 python3.9[214313]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:38 compute-1 sudo[214311]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:38.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:38 compute-1 sudo[214463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaycceyynxhpzlzytqmajwnelhrifcix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026198.2908335-1511-277436154816559/AnsiballZ_file.py'
Feb 02 09:56:38 compute-1 sudo[214463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:38 compute-1 python3.9[214465]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:38 compute-1 sudo[214463]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:39 compute-1 sudo[214616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrujvscjjnhgzulqysddfovfeunrcsfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026198.911971-1511-261690885273581/AnsiballZ_file.py'
Feb 02 09:56:39 compute-1 sudo[214616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:39 compute-1 podman[214618]: 2026-02-02 09:56:39.303244744 +0000 UTC m=+0.089203269 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 02 09:56:39 compute-1 python3.9[214619]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:39 compute-1 sudo[214616]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:39 compute-1 ceph-mon[80115]: pgmap v509: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:56:39 compute-1 sudo[214794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryddblreylzhbgzorcpvvkngoqcwifmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026199.5973706-1511-132779138028367/AnsiballZ_file.py'
Feb 02 09:56:39 compute-1 sudo[214794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:40 compute-1 python3.9[214796]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:40 compute-1 sudo[214794]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:40.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:40.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:40 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 02 09:56:40 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Feb 02 09:56:40 compute-1 sudo[214949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pusvfmqtdkvifippsiefmekjvqlamfcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026200.680557-1683-175397881602218/AnsiballZ_file.py'
Feb 02 09:56:40 compute-1 sudo[214949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:41 compute-1 python3.9[214951]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:41 compute-1 sudo[214949]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:41 compute-1 ceph-mon[80115]: pgmap v510: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:56:41 compute-1 sudo[215101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbshggwhrtjzgjrqhgfzfzimrctleqob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026201.3135912-1683-43228498337193/AnsiballZ_file.py'
Feb 02 09:56:41 compute-1 sudo[215101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:41 compute-1 python3.9[215103]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:41 compute-1 sudo[215101]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:56:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:42.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:56:42 compute-1 sudo[215253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkilzqfvardaibrscrnwgcmaqshhzdnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026201.9012005-1683-184901317479387/AnsiballZ_file.py'
Feb 02 09:56:42 compute-1 sudo[215253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:42.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:42 compute-1 python3.9[215255]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:42 compute-1 sudo[215253]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:42 compute-1 ceph-mon[80115]: pgmap v511: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:56:42 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 7.
Feb 02 09:56:42 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:56:42 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.287s CPU time.
Feb 02 09:56:42 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:56:42 compute-1 sudo[215418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkjlcdpjuskfhrtnhonmondgtexgppbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026202.5642836-1683-235232861119183/AnsiballZ_file.py'
Feb 02 09:56:42 compute-1 sudo[215418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:43 compute-1 podman[215455]: 2026-02-02 09:56:43.05170777 +0000 UTC m=+0.047347511 container create a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True)
Feb 02 09:56:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:56:43 compute-1 python3.9[215425]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:56:43 compute-1 sudo[215418]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:56:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:56:43 compute-1 podman[215455]: 2026-02-02 09:56:43.025924671 +0000 UTC m=+0.021564472 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:56:43 compute-1 podman[215455]: 2026-02-02 09:56:43.140711453 +0000 UTC m=+0.136351214 container init a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Feb 02 09:56:43 compute-1 podman[215455]: 2026-02-02 09:56:43.144328385 +0000 UTC m=+0.139968126 container start a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:56:43 compute-1 bash[215455]: a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de
Feb 02 09:56:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:56:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:56:43 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:56:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:56:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:56:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:56:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:56:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:56:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:43 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:56:43 compute-1 sudo[215661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kippxeanctqcydyyvsocornftvgbjvww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026203.2934098-1683-233062129432896/AnsiballZ_file.py'
Feb 02 09:56:43 compute-1 sudo[215661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:44 compute-1 python3.9[215663]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:44 compute-1 sudo[215661]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:44.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:44.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:44 compute-1 sudo[215813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orpupwtalhacoevhpwyarlakhvxdlncs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026204.1845186-1683-243942302554907/AnsiballZ_file.py'
Feb 02 09:56:44 compute-1 sudo[215813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:44 compute-1 python3.9[215815]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:44 compute-1 sudo[215813]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:56:44.896 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:56:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:56:44.896 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:56:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:56:44.897 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:56:45 compute-1 sudo[215966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mflwuedmecvkspnufzhkqvwbuooqaxgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026204.772146-1683-189587464028834/AnsiballZ_file.py'
Feb 02 09:56:45 compute-1 sudo[215966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:45 compute-1 python3.9[215968]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:45 compute-1 sudo[215966]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:45 compute-1 ceph-mon[80115]: pgmap v512: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:56:45 compute-1 sudo[216118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvuqczkqyklwsvcwtldnrpfesemkdqvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026205.4119122-1683-35598617571368/AnsiballZ_file.py'
Feb 02 09:56:45 compute-1 sudo[216118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:45 compute-1 python3.9[216120]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:56:45 compute-1 sudo[216118]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:46.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:46.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:46 compute-1 sudo[216270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyfyjagynuflkxsxunujgvytsnqejgfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026206.4101532-1856-44352420518791/AnsiballZ_command.py'
Feb 02 09:56:46 compute-1 sudo[216270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:46 compute-1 python3.9[216272]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:46 compute-1 sudo[216270]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:47 compute-1 ceph-mon[80115]: pgmap v513: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:56:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:56:47 compute-1 python3.9[216425]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 02 09:56:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:56:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:48.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:56:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:48.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:48 compute-1 sudo[216591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znmafgqxnohsvalyctgcmbsdmcytichv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026208.4366887-1910-166144605257275/AnsiballZ_systemd_service.py'
Feb 02 09:56:48 compute-1 sudo[216591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:48 compute-1 podman[216549]: 2026-02-02 09:56:48.770497757 +0000 UTC m=+0.067876034 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 02 09:56:49 compute-1 python3.9[216597]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 02 09:56:49 compute-1 systemd[1]: Reloading.
Feb 02 09:56:49 compute-1 systemd-rc-local-generator[216623]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:56:49 compute-1 systemd-sysv-generator[216627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:56:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:49 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:56:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:49 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:56:49 compute-1 sudo[216591]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:49 compute-1 ceph-mon[80115]: pgmap v514: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:56:49 compute-1 sudo[216783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzdosfskiakvronsrwkpvahpsqfhtchk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026209.6649268-1934-96555533545713/AnsiballZ_command.py'
Feb 02 09:56:49 compute-1 sudo[216783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:50.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:50 compute-1 python3.9[216785]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:50 compute-1 sudo[216783]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:50.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:50 compute-1 sudo[216936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpfxbsxewxdzhvntbgtwwcugrydvouzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026210.3593876-1934-268291311830885/AnsiballZ_command.py'
Feb 02 09:56:50 compute-1 sudo[216936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:50 compute-1 python3.9[216938]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:50 compute-1 sudo[216936]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:51 compute-1 sudo[217090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txneahjtntgaroyupaublecofegbyfhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026211.061364-1934-267738974481231/AnsiballZ_command.py'
Feb 02 09:56:51 compute-1 sudo[217090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:51 compute-1 ceph-mon[80115]: pgmap v515: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Feb 02 09:56:51 compute-1 python3.9[217092]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:51 compute-1 sudo[217090]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:52 compute-1 sudo[217243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gujqwvldiyerxiljdyssirawfmtnqcob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026211.7488356-1934-82641087867918/AnsiballZ_command.py'
Feb 02 09:56:52 compute-1 sudo[217243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:52.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:52 compute-1 python3.9[217245]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:52.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:52 compute-1 sudo[217243]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:52 compute-1 sudo[217396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdknnltwyxwycqrxnzmmzxlrduxtudir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026212.4267893-1934-231021233434954/AnsiballZ_command.py'
Feb 02 09:56:52 compute-1 sudo[217396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:52 compute-1 python3.9[217398]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:52 compute-1 sudo[217396]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:53 compute-1 sudo[217550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkpmeoxdqnjpaihhdmpvnnztcnbpawru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026212.9882452-1934-101255246759878/AnsiballZ_command.py'
Feb 02 09:56:53 compute-1 sudo[217550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:53 compute-1 python3.9[217552]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:53 compute-1 ceph-mon[80115]: pgmap v516: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:56:53 compute-1 sudo[217550]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:53 compute-1 sudo[217703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvbpiutjraiyvxvcvhrczqqntgiywznb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026213.6112888-1934-173622963230239/AnsiballZ_command.py'
Feb 02 09:56:53 compute-1 sudo[217703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:54 compute-1 python3.9[217705]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:54 compute-1 sudo[217703]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:54.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:54.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:54 compute-1 sudo[217856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzuqswattjvjkeqdvopxcczhwembdveg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026214.2227018-1934-134066129984152/AnsiballZ_command.py'
Feb 02 09:56:54 compute-1 sudo[217856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:54 compute-1 python3.9[217858]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 02 09:56:54 compute-1 sudo[217856]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe370000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:55 compute-1 ceph-mon[80115]: pgmap v517: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe368001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:55 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe354000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:56:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:56.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:56:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:56:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:56.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:56:56 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:56:56 compute-1 sudo[217901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:56:56 compute-1 sudo[217901]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:56:56 compute-1 sudo[217901]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:57 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe348000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:56:57 compute-1 ceph-mon[80115]: pgmap v518: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:56:57 compute-1 kernel: ganesha.nfsd[217899]: segfault at 50 ip 00007fe3f9da232e sp 00007fe3627fb210 error 4 in libntirpc.so.5.8[7fe3f9d87000+2c000] likely on CPU 0 (core 0, socket 0)
Feb 02 09:56:57 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:56:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[215470]: 02/02/2026 09:56:57 : epoch 698074db : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe348000b60 fd 37 proxy ignored for local
Feb 02 09:56:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095657 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:56:57 compute-1 systemd[1]: Started Process Core Dump (PID 218020/UID 0).
Feb 02 09:56:57 compute-1 sudo[218054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttfzitdquohwsivrydyqyjbzhzuglkza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026217.3881593-2141-131852521899749/AnsiballZ_file.py'
Feb 02 09:56:57 compute-1 sudo[218054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:57 compute-1 python3.9[218056]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:56:57 compute-1 sudo[218054]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:56:58.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:56:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:56:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:56:58.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:56:58 compute-1 sudo[218206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkwheqayzqubuqvrkmgrowckjbznbdri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026218.0072062-2141-165364619811481/AnsiballZ_file.py'
Feb 02 09:56:58 compute-1 sudo[218206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:58 compute-1 systemd-coredump[218027]: Process 215474 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 54:
                                                    #0  0x00007fe3f9da232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 09:56:58 compute-1 systemd[1]: systemd-coredump@7-218020-0.service: Deactivated successfully.
Feb 02 09:56:58 compute-1 podman[218213]: 2026-02-02 09:56:58.461652777 +0000 UTC m=+0.025389749 container died a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:56:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-8b577d442f85adf05c581d868a55d038674fea5811e9f6b8663a812196de41a7-merged.mount: Deactivated successfully.
Feb 02 09:56:58 compute-1 podman[218213]: 2026-02-02 09:56:58.49582355 +0000 UTC m=+0.059560502 container remove a3b92c046843d2bbfa703febce951685c5306a1f96da141839946a768e80c6de (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Feb 02 09:56:58 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:56:58 compute-1 python3.9[218208]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:56:58 compute-1 sudo[218206]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:58 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:56:58 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.053s CPU time.
Feb 02 09:56:59 compute-1 sudo[218405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbjprjpftkkmppspjeghtizdefdwshxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026218.7475944-2141-280225275209621/AnsiballZ_file.py'
Feb 02 09:56:59 compute-1 sudo[218405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:56:59 compute-1 python3.9[218407]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:56:59 compute-1 sudo[218405]: pam_unix(sudo:session): session closed for user root
Feb 02 09:56:59 compute-1 ceph-mon[80115]: pgmap v519: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:56:59 compute-1 sudo[218557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnzauqlywbdypzqlbbnyhjioaartdhbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026219.6007366-2207-230891464586293/AnsiballZ_file.py'
Feb 02 09:56:59 compute-1 sudo[218557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:00 compute-1 python3.9[218559]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:00 compute-1 sudo[218557]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:00.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:00.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:00 compute-1 sudo[218709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hourlnxwdbzodxozhplgclrsnjnqarat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026220.251648-2207-205616278975263/AnsiballZ_file.py'
Feb 02 09:57:00 compute-1 sudo[218709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:00 compute-1 python3.9[218711]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:00 compute-1 sudo[218709]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:01 compute-1 sudo[218862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upvkcokadmipxwearsckkpaodbwdpitd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026221.1467316-2207-204170634312925/AnsiballZ_file.py'
Feb 02 09:57:01 compute-1 sudo[218862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:01 compute-1 ceph-mon[80115]: pgmap v520: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:57:01 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:01 compute-1 python3.9[218864]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:01 compute-1 sudo[218862]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:02 compute-1 sudo[219014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnlewvzuscstvjsowwlmnshdioechyyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026221.8366423-2207-265192271310527/AnsiballZ_file.py'
Feb 02 09:57:02 compute-1 sudo[219014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:02.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:02.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:02 compute-1 python3.9[219016]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:02 compute-1 sudo[219014]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:57:02 compute-1 sudo[219167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wijxdnwswzpnnnrhllxerelhbuqznpax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026222.499809-2207-65650020156028/AnsiballZ_file.py'
Feb 02 09:57:02 compute-1 sudo[219167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:02 compute-1 python3.9[219169]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:02 compute-1 sudo[219167]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:03 compute-1 sudo[219319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scrvwuogahzrcaxuefabpzmdclbzgrja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026223.126562-2207-37033052878153/AnsiballZ_file.py'
Feb 02 09:57:03 compute-1 sudo[219319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:03 compute-1 ceph-mon[80115]: pgmap v521: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:57:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095703 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:57:03 compute-1 python3.9[219321]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:03 compute-1 sudo[219319]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:04 compute-1 sudo[219471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlazyffsfnsqksuwbetvacxceheutusc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026223.8414612-2207-71127955422531/AnsiballZ_file.py'
Feb 02 09:57:04 compute-1 sudo[219471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:04.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:04.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:04 compute-1 python3.9[219473]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:04 compute-1 sudo[219471]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:04 compute-1 ceph-mon[80115]: pgmap v522: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:57:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000051s ======
Feb 02 09:57:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:06.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Feb 02 09:57:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:06.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:06 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:07 compute-1 ceph-mon[80115]: pgmap v523: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:57:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:57:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:08.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:57:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:08.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:08 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 8.
Feb 02 09:57:08 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:57:08 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.053s CPU time.
Feb 02 09:57:08 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:57:09 compute-1 podman[219547]: 2026-02-02 09:57:09.076616632 +0000 UTC m=+0.064123979 container create 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 09:57:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:09 compute-1 podman[219547]: 2026-02-02 09:57:09.132010906 +0000 UTC m=+0.119518253 container init 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Feb 02 09:57:09 compute-1 podman[219547]: 2026-02-02 09:57:09.044389148 +0000 UTC m=+0.031896545 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:57:09 compute-1 podman[219547]: 2026-02-02 09:57:09.140665997 +0000 UTC m=+0.128173334 container start 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:57:09 compute-1 bash[219547]: 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e
Feb 02 09:57:09 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:57:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:57:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:57:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:57:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:57:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:57:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:57:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:57:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:09 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:57:09 compute-1 ceph-mon[80115]: pgmap v524: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:57:10 compute-1 sudo[219740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdggduzcmurjrwwkvkwmhxhrrxpptbeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026229.419381-2532-112197995870845/AnsiballZ_getent.py'
Feb 02 09:57:10 compute-1 sudo[219740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:10.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:10 compute-1 podman[219704]: 2026-02-02 09:57:10.181819059 +0000 UTC m=+0.126082701 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 02 09:57:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:10.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:10 compute-1 python3.9[219746]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 02 09:57:10 compute-1 sudo[219740]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:11 compute-1 sudo[219907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mclojujpqeeqsradbqaxmsnoumudgptv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026230.6213233-2556-19282609141383/AnsiballZ_group.py'
Feb 02 09:57:11 compute-1 sudo[219907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:11 compute-1 python3.9[219909]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 02 09:57:11 compute-1 groupadd[219910]: group added to /etc/group: name=nova, GID=42436
Feb 02 09:57:11 compute-1 groupadd[219910]: group added to /etc/gshadow: name=nova
Feb 02 09:57:11 compute-1 groupadd[219910]: new group: name=nova, GID=42436
Feb 02 09:57:11 compute-1 sudo[219907]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:11 compute-1 ceph-mon[80115]: pgmap v525: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:57:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:11 compute-1 sudo[220065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isfkrfvcrxcqhusfqrcyjdfqjqqfucgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026231.5239775-2580-149856400621730/AnsiballZ_user.py'
Feb 02 09:57:11 compute-1 sudo[220065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:12 compute-1 python3.9[220067]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 02 09:57:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:12 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:57:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:12.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:12 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:57:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:12.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:12 compute-1 useradd[220069]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Feb 02 09:57:12 compute-1 ceph-mon[80115]: pgmap v526: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:57:12 compute-1 useradd[220069]: add 'nova' to group 'libvirt'
Feb 02 09:57:12 compute-1 useradd[220069]: add 'nova' to shadow group 'libvirt'
Feb 02 09:57:12 compute-1 sudo[220065]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:14 compute-1 sshd-session[220102]: Accepted publickey for zuul from 192.168.122.30 port 49830 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 09:57:14 compute-1 systemd-logind[805]: New session 54 of user zuul.
Feb 02 09:57:14 compute-1 systemd[1]: Started Session 54 of User zuul.
Feb 02 09:57:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:57:14 compute-1 sshd-session[220102]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 09:57:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:14.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:57:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:14.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:14 compute-1 sshd-session[220105]: Received disconnect from 192.168.122.30 port 49830:11: disconnected by user
Feb 02 09:57:14 compute-1 sshd-session[220105]: Disconnected from user zuul 192.168.122.30 port 49830
Feb 02 09:57:14 compute-1 sshd-session[220102]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:57:14 compute-1 systemd[1]: session-54.scope: Deactivated successfully.
Feb 02 09:57:14 compute-1 systemd-logind[805]: Session 54 logged out. Waiting for processes to exit.
Feb 02 09:57:14 compute-1 systemd-logind[805]: Removed session 54.
Feb 02 09:57:15 compute-1 python3.9[220256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:57:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:15 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:57:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:15 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:57:15 compute-1 ceph-mon[80115]: pgmap v527: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:57:15 compute-1 python3.9[220377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026234.7935214-2655-231399672093957/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:16.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:16.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:16 compute-1 python3.9[220527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:57:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:16 compute-1 sudo[220605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:57:16 compute-1 sudo[220605]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:57:16 compute-1 sudo[220605]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:16 compute-1 python3.9[220603]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:17 compute-1 python3.9[220779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:57:17 compute-1 ceph-mon[80115]: pgmap v528: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:57:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:57:17 compute-1 python3.9[220900]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026236.9955747-2655-255954910177455/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:18.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:57:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:18.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:57:18 compute-1 python3.9[221050]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:57:19 compute-1 podman[221146]: 2026-02-02 09:57:19.171080561 +0000 UTC m=+0.066793577 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Feb 02 09:57:19 compute-1 python3.9[221183]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026238.0810962-2655-16037526371783/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:19 compute-1 ceph-mon[80115]: pgmap v529: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:57:19 compute-1 python3.9[221341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:57:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:20.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:20.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:20 compute-1 python3.9[221462]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026239.4295225-2655-85468870633035/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:21 compute-1 python3.9[221613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc740000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:21 compute-1 ceph-mon[80115]: pgmap v530: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:57:21 compute-1 python3.9[221750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026240.5676095-2655-253180740455012/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:21 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:22.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:22.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:22 compute-1 sudo[221900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-towpcfdbyiywqkejouuwhznlfnrreaig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026242.1273174-2904-194531946058508/AnsiballZ_file.py'
Feb 02 09:57:22 compute-1 sudo[221900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:22 compute-1 python3.9[221902]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:57:22 compute-1 sudo[221900]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:23 compute-1 sudo[222053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nimpokkrzbjebccldwjjryzztcjkchsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026242.8999655-2928-111157697708875/AnsiballZ_copy.py'
Feb 02 09:57:23 compute-1 sudo[222053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:23 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:23 compute-1 python3.9[222055]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:57:23 compute-1 sudo[222053]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:23 compute-1 ceph-mon[80115]: pgmap v531: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:57:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095723 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:57:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:23 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:23 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7240016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:23 compute-1 sudo[222103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:57:23 compute-1 sudo[222103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:57:23 compute-1 sudo[222103]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:23 compute-1 sudo[222157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:57:23 compute-1 sudo[222157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:57:24 compute-1 sudo[222255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sadgjbygyexsmkqnmihgtxxhdzarzgcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026243.7158935-2952-196976884877774/AnsiballZ_stat.py'
Feb 02 09:57:24 compute-1 sudo[222255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:24.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:24 compute-1 python3.9[222259]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:57:24 compute-1 sudo[222255]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:24 compute-1 sudo[222157]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:57:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:24.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:57:24 compute-1 sudo[222440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsktxtcnuwqagwqpmamdwwqxrpeqgmmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026244.4763923-2976-137110893292462/AnsiballZ_stat.py'
Feb 02 09:57:24 compute-1 sudo[222440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:24 compute-1 python3.9[222442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:57:24 compute-1 sudo[222440]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:25 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7280016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:25 compute-1 sudo[222563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drufensbrhehhdvldkbkilokdpqwznlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026244.4763923-2976-137110893292462/AnsiballZ_copy.py'
Feb 02 09:57:25 compute-1 sudo[222563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:25 compute-1 ceph-mon[80115]: pgmap v532: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:57:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:57:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:57:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:57:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:57:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:57:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:57:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:57:25 compute-1 python3.9[222565]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1770026244.4763923-2976-137110893292462/.source _original_basename=.3j3jpy7v follow=False checksum=2ad71deba695918c6972b05c522c472e308b3cb4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb 02 09:57:25 compute-1 sudo[222563]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:25 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:25 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730001570 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:26.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:57:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:26.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:57:26 compute-1 python3.9[222717]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:57:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:27 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7240016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:27 compute-1 python3.9[222870]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:57:27 compute-1 ceph-mon[80115]: pgmap v533: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:57:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:27 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7280016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:27 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:27 compute-1 python3.9[222991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026246.858164-3054-69020391126883/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aafdeb4849f80b4aa3d95767e2f1397576892cd0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:57:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:28.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:57:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:57:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:28.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:57:28 compute-1 python3.9[223141]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 02 09:57:29 compute-1 python3.9[223263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1770026248.133443-3099-9849263248571/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=a1f1b826d995a314b6b973b7452c5ae4777408c1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 02 09:57:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:29 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730002090 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:29 compute-1 sudo[223288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:57:29 compute-1 sudo[223288]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:57:29 compute-1 sudo[223288]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:29 compute-1 ceph-mon[80115]: pgmap v534: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:57:29 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:57:29 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:57:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:29 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7240016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:29 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc7240016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:30 compute-1 sudo[223438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmugxkrtanxesooochcxgcbrarqkpxfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026249.6121979-3150-271949008312878/AnsiballZ_container_config_data.py'
Feb 02 09:57:30 compute-1 sudo[223438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:30.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:30 compute-1 python3.9[223440]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
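The container_config_data call above appears to do no more than load the JSON container definition(s) matching config_pattern under config_path. To eyeball the same data on the host (illustrative only):

    sudo python3 -m json.tool /var/lib/openstack/config/containers/nova_compute_init.json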
Feb 02 09:57:30 compute-1 sudo[223438]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:30.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:31 compute-1 sudo[223591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptvqtddvvudshhcjcxcqpmetixvgwhmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026250.679216-3183-239239193726759/AnsiballZ_container_config_hash.py'
Feb 02 09:57:31 compute-1 sudo[223591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:31 compute-1 python3.9[223593]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 02 09:57:31 compute-1 sudo[223591]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:31 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:31 compute-1 ceph-mon[80115]: pgmap v535: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:57:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:31 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730002090 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:31 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728002720 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.063641) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252063668, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1277, "num_deletes": 254, "total_data_size": 3111561, "memory_usage": 3137440, "flush_reason": "Manual Compaction"}
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252077974, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2037435, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18468, "largest_seqno": 19740, "table_properties": {"data_size": 2031933, "index_size": 2897, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11090, "raw_average_key_size": 18, "raw_value_size": 2021027, "raw_average_value_size": 3390, "num_data_blocks": 130, "num_entries": 596, "num_filter_entries": 596, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026143, "oldest_key_time": 1770026143, "file_creation_time": 1770026252, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 14401 microseconds, and 4096 cpu microseconds.
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.078035) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2037435 bytes OK
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.078058) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.081886) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.081909) EVENT_LOG_v1 {"time_micros": 1770026252081902, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.081927) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3105476, prev total WAL file size 3105476, number of live WAL files 2.
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.082742) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323531' seq:0, type:0; will stop at (end)
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(1989KB)], [33(11MB)]
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252082798, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 13736802, "oldest_snapshot_seqno": -1}
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4980 keys, 13234698 bytes, temperature: kUnknown
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252172105, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13234698, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13200221, "index_size": 20927, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 126574, "raw_average_key_size": 25, "raw_value_size": 13108586, "raw_average_value_size": 2632, "num_data_blocks": 858, "num_entries": 4980, "num_filter_entries": 4980, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026252, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.172319) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13234698 bytes
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.179977) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.7 rd, 148.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.2 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(13.2) write-amplify(6.5) OK, records in: 5502, records dropped: 522 output_compression: NoCompression
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.180012) EVENT_LOG_v1 {"time_micros": 1770026252179998, "job": 18, "event": "compaction_finished", "compaction_time_micros": 89384, "compaction_time_cpu_micros": 36350, "output_level": 6, "num_output_files": 1, "total_output_size": 13234698, "num_input_records": 5502, "num_output_records": 4980, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252180306, "job": 18, "event": "table_file_deletion", "file_number": 35}
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026252181062, "job": 18, "event": "table_file_deletion", "file_number": 33}
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.082626) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181098) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181103) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181105) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:57:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:57:32.181108) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
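The rocksdb block above is the monitor compacting its own store: JOB 17 flushes the memtable to an L0 table (#35), and JOB 18 then manually compacts L0 plus the existing L6 file into a single ~12.6 MB L6 table (#36), deleting the inputs. If the mon store ever needs an explicit compaction or a size check, something like the following works, assuming this is a cephadm-managed cluster (which the containerized service names suggest):

    sudo cephadm shell -- ceph tell mon.compute-1 compact   # request an explicit mon store compaction
    # the store.db path logged above (/var/lib/ceph/mon/ceph-compute-1/store.db) is the in-container view;
    # on a cephadm host the data typically lives under /var/lib/ceph/<fsid>/mon.compute-1/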
Feb 02 09:57:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:57:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:32.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:57:32 compute-1 sudo[223743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muxhbmeimjhjsywuuiqlsytvmcprxlvp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1770026251.7393787-3213-126891123067894/AnsiballZ_edpm_container_manage.py'
Feb 02 09:57:32 compute-1 sudo[223743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:32.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:32 compute-1 python3[223745]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Feb 02 09:57:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:57:33 compute-1 ceph-mon[80115]: pgmap v536: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:57:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:33 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:33 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:33 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730002090 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:34.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:34.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:35 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728002720 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:35 compute-1 ceph-mon[80115]: pgmap v537: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:57:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:35 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:35 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:36.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:57:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:36.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:57:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:36 compute-1 sudo[223805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:57:36 compute-1 sudo[223805]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:57:36 compute-1 sudo[223805]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:37 compute-1 ceph-mon[80115]: pgmap v538: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:37 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730003520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:37 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728002720 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:37 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:38.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:57:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:38.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:57:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:39 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:39 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730003520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:39 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:39 compute-1 ceph-mon[80115]: pgmap v539: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:40.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:57:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:40.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:57:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:41 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:41 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:41 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730003520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:57:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:42.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:57:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:42.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:42 compute-1 ceph-mon[80115]: pgmap v540: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:43 compute-1 podman[223847]: 2026-02-02 09:57:43.01526774 +0000 UTC m=+2.685890834 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
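The podman event above is the periodic health check for ovn_controller reporting healthy with a failing streak of 0. The same probe (the /openstack/healthcheck test listed in config_data) can be run on demand; a hedged sketch:

    sudo podman healthcheck run ovn_controller; echo "exit=$?"   # 0 means the /openstack/healthcheck probe passed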
Feb 02 09:57:43 compute-1 podman[223760]: 2026-02-02 09:57:43.046254751 +0000 UTC m=+10.472975788 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83
Feb 02 09:57:43 compute-1 podman[223898]: 2026-02-02 09:57:43.245603793 +0000 UTC m=+0.079862041 container create f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb 02 09:57:43 compute-1 podman[223898]: 2026-02-02 09:57:43.202520323 +0000 UTC m=+0.036778581 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83
Feb 02 09:57:43 compute-1 python3[223745]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
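The PODMAN-CONTAINER-DEBUG line is the literal podman create command edpm_container_manage rendered for nova_compute_init: a one-shot root container with no network whose job appears to be running nova_statedir_ownership.py against the bind-mounted /var/lib/nova, logging through /dev/log with the nova_compute_init tag. Two hedged follow-ups to inspect the result:

    sudo podman inspect nova_compute_init --format '{{ json .Config.Labels }}'   # config_id / container_name / managed_by labels set above
    sudo journalctl -t nova_compute_init -n 20                                   # output sent via 'logger -t nova_compute_init'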
Feb 02 09:57:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:43 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:43 compute-1 sudo[223743]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:43 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:43 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:43 compute-1 ceph-mon[80115]: pgmap v541: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:43 compute-1 sudo[224086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzfvoiiubkbmferstlpowhsguozwyeia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026263.6870234-3237-11680919410126/AnsiballZ_stat.py'
Feb 02 09:57:43 compute-1 sudo[224086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:44 compute-1 python3.9[224088]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:57:44 compute-1 sudo[224086]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:44.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:44.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:57:44.897 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:57:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:57:44.898 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:57:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:57:44.898 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:57:45 compute-1 ceph-mon[80115]: pgmap v542: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:57:45 compute-1 sudo[224241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ourlopvwcvagtozhbahygvtwmlunbwmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026264.818616-3273-142454019449588/AnsiballZ_container_config_data.py'
Feb 02 09:57:45 compute-1 sudo[224241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:45 compute-1 python3.9[224243]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Feb 02 09:57:45 compute-1 sudo[224241]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:45 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730004620 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:45 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:45 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:46 compute-1 sudo[224393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqauucbxbplwqruetlsadfiriiymhpsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026265.8071783-3306-188732613295550/AnsiballZ_container_config_hash.py'
Feb 02 09:57:46 compute-1 sudo[224393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:46.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:46.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:46 compute-1 python3.9[224395]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 02 09:57:46 compute-1 sudo[224393]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:47 compute-1 sudo[224546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpigxctpphdjoipazgmdxqrheqwiidkn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1770026266.8147929-3336-193800510890335/AnsiballZ_edpm_container_manage.py'
Feb 02 09:57:47 compute-1 sudo[224546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:47 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:47 compute-1 python3[224548]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Feb 02 09:57:47 compute-1 ceph-mon[80115]: pgmap v543: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:57:47 compute-1 podman[224584]: 2026-02-02 09:57:47.591649918 +0000 UTC m=+0.056420992 container create 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 02 09:57:47 compute-1 podman[224584]: 2026-02-02 09:57:47.565284344 +0000 UTC m=+0.030055418 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83
Feb 02 09:57:47 compute-1 python3[224548]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83 kolla_start
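Unlike the init container, nova_compute is created privileged, with host networking and host PID, running as the nova user, and with the libvirt, nova state, iSCSI/NVMe, multipath and ceph config paths bind-mounted. Note it is only created here; the actual start is handed to systemd via the edpm_nova_compute.service unit installed below. Once it is running:

    sudo podman ps --filter name=nova_compute --format '{{.Names}} {{.Status}}'
    sudo podman top nova_compute   # processes should be owned by the nova user per '--user nova'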
Feb 02 09:57:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:47 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730004620 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:47 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:47 compute-1 sudo[224546]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:48.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:48.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:48 compute-1 sudo[224771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgkpmczolfhpuifxlfqgvhswiontniat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026268.05655-3360-177010899021442/AnsiballZ_stat.py'
Feb 02 09:57:48 compute-1 sudo[224771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:48 compute-1 python3.9[224773]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:57:48 compute-1 sudo[224771]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:49 compute-1 sudo[224926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvqwcrgddacjwnavrvabtanioavxvspg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026268.8942637-3387-137951808235280/AnsiballZ_file.py'
Feb 02 09:57:49 compute-1 sudo[224926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:49 compute-1 podman[224928]: 2026-02-02 09:57:49.318032032 +0000 UTC m=+0.070994895 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 02 09:57:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:49 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:49 compute-1 python3.9[224929]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 02 09:57:49 compute-1 ceph-mon[80115]: pgmap v544: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:49 compute-1 sudo[224926]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:49 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:49 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc730004620 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:49 compute-1 sudo[225096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlmagzrhrrezxzorjmegzbbpxagcfaip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026269.540563-3387-101090893348406/AnsiballZ_copy.py'
Feb 02 09:57:49 compute-1 sudo[225096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:50 compute-1 python3.9[225098]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1770026269.540563-3387-101090893348406/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
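The copy above installs /etc/systemd/system/edpm_nova_compute.service; its body is not shown in the log, so rather than guessing it, the installed unit can be inspected and tied back to the --conmon-pidfile /run/nova_compute.pid used at create time:

    sudo systemctl cat edpm_nova_compute.service
    sudo systemctl show edpm_nova_compute.service -p ExecStart -p PIDFile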
Feb 02 09:57:50 compute-1 sudo[225096]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:50.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:50.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:50 compute-1 sudo[225172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dknbfnelyxbsmscnledrpgydzvidyjok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026269.540563-3387-101090893348406/AnsiballZ_systemd.py'
Feb 02 09:57:50 compute-1 sudo[225172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:50 compute-1 python3.9[225174]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 02 09:57:50 compute-1 systemd[1]: Reloading.
Feb 02 09:57:50 compute-1 systemd-rc-local-generator[225201]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:57:50 compute-1 systemd-sysv-generator[225206]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 02 09:57:50 compute-1 sudo[225172]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:51 compute-1 sudo[225284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgtgoleuyrcewqhbuwvvlthynehgzxur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026269.540563-3387-101090893348406/AnsiballZ_systemd.py'
Feb 02 09:57:51 compute-1 sudo[225284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:51 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:51 compute-1 ceph-mon[80115]: pgmap v545: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:51 compute-1 python3.9[225286]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 02 09:57:51 compute-1 systemd[1]: Reloading.
Feb 02 09:57:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:51 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc71c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:51 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:51 compute-1 systemd-rc-local-generator[225311]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 02 09:57:51 compute-1 systemd-sysv-generator[225316]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
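The ansible-systemd call above (state=restarted, enabled=True) is equivalent to enabling and restarting the unit by hand; the rc.local and SysV network generator notes emitted on each daemon-reload concern pre-existing files under /etc/rc.d and are unrelated to this change:

    sudo systemctl enable edpm_nova_compute.service
    sudo systemctl restart edpm_nova_compute.service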
Feb 02 09:57:51 compute-1 systemd[1]: Starting nova_compute container...
Feb 02 09:57:51 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:57:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:51 compute-1 podman[225326]: 2026-02-02 09:57:51.993451697 +0000 UTC m=+0.123134736 container init 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 02 09:57:52 compute-1 podman[225326]: 2026-02-02 09:57:52.000974309 +0000 UTC m=+0.130657348 container start 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 02 09:57:52 compute-1 podman[225326]: nova_compute
Feb 02 09:57:52 compute-1 nova_compute[225341]: + sudo -E kolla_set_configs
Feb 02 09:57:52 compute-1 systemd[1]: Started nova_compute container.
Feb 02 09:57:52 compute-1 sudo[225284]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Validating config file
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying service configuration files
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Deleting /etc/ceph
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Creating directory /etc/ceph
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Writing out command to execute
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 02 09:57:52 compute-1 nova_compute[225341]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 02 09:57:52 compute-1 nova_compute[225341]: ++ cat /run_command
Feb 02 09:57:52 compute-1 nova_compute[225341]: + CMD=nova-compute
Feb 02 09:57:52 compute-1 nova_compute[225341]: + ARGS=
Feb 02 09:57:52 compute-1 nova_compute[225341]: + sudo kolla_copy_cacerts
Feb 02 09:57:52 compute-1 nova_compute[225341]: + [[ ! -n '' ]]
Feb 02 09:57:52 compute-1 nova_compute[225341]: + . kolla_extend_start
Feb 02 09:57:52 compute-1 nova_compute[225341]: Running command: 'nova-compute'
Feb 02 09:57:52 compute-1 nova_compute[225341]: + echo 'Running command: '\''nova-compute'\'''
Feb 02 09:57:52 compute-1 nova_compute[225341]: + umask 0022
Feb 02 09:57:52 compute-1 nova_compute[225341]: + exec nova-compute
Feb 02 09:57:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:52.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:57:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:52.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:57:53 compute-1 python3.9[225503]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:57:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:53 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:53 compute-1 ceph-mon[80115]: pgmap v546: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:53 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:53 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc720000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:54 compute-1 python3.9[225657]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:57:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:57:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:54.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:57:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:54.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:54 compute-1 sshd-session[225658]: Invalid user solv from 80.94.92.184 port 44160
Feb 02 09:57:54 compute-1 nova_compute[225341]: 2026-02-02 09:57:54.644 225345 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 02 09:57:54 compute-1 nova_compute[225341]: 2026-02-02 09:57:54.645 225345 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 02 09:57:54 compute-1 nova_compute[225341]: 2026-02-02 09:57:54.645 225345 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 02 09:57:54 compute-1 nova_compute[225341]: 2026-02-02 09:57:54.645 225345 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 02 09:57:54 compute-1 sshd-session[225658]: Connection closed by invalid user solv 80.94.92.184 port 44160 [preauth]
Feb 02 09:57:54 compute-1 nova_compute[225341]: 2026-02-02 09:57:54.791 225345 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 09:57:54 compute-1 nova_compute[225341]: 2026-02-02 09:57:54.815 225345 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 09:57:54 compute-1 nova_compute[225341]: 2026-02-02 09:57:54.815 225345 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 02 09:57:54 compute-1 python3.9[225811]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 02 09:57:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:55 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.461 225345 INFO nova.virt.driver [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 02 09:57:55 compute-1 ceph-mon[80115]: pgmap v547: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.610 225345 INFO nova.compute.provider_config [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 02 09:57:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:55 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc740000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:55 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc720000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.666 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.667 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.668 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.669 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.669 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.670 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.670 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.671 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.671 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.671 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.672 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.672 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.673 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.673 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.674 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.674 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.674 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.675 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.675 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.676 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.676 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.677 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.677 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.678 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.678 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.679 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.679 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.680 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.680 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.680 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.681 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.681 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.682 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.682 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.683 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.684 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.684 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.685 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.685 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.686 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.686 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.687 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.688 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.688 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.689 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.690 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.690 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.691 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.691 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.691 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.692 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.692 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.693 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.693 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.693 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.694 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.694 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.694 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.695 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.695 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.696 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.696 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.696 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.697 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.697 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.697 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.698 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.698 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.698 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.699 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.699 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.699 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.699 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.700 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.700 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.700 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.701 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.701 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.701 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.701 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.702 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.702 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.702 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.703 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.703 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.703 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.704 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.704 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.704 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 sudo[225964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnxzbogfhosgupemsewbivefcwzzzrtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026275.2185092-3567-4035639946730/AnsiballZ_podman_container.py'
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.705 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.705 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.705 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.706 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.706 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.707 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.707 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.707 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.708 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.708 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.709 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 sudo[225964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.709 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.709 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.710 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.710 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.710 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.711 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.711 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.711 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.711 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.712 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.712 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.712 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.712 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.713 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.713 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.713 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.713 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.714 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.714 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.714 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.715 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.715 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.715 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.715 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.716 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.716 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.716 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.716 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.717 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.717 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.717 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.717 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.718 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.718 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.718 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.718 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.719 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.719 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.719 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.719 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.720 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.720 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.720 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.720 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.721 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.721 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.721 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.721 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.722 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.722 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.722 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.723 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.723 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.723 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.723 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.724 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.724 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.724 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.724 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.725 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.725 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.725 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.725 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.726 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.726 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.726 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.727 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.727 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.727 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.727 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.728 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.728 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.728 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.728 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.729 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.729 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.729 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.729 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.730 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.731 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.732 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.733 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.734 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.735 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.736 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.737 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.738 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.739 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.740 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.741 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.742 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.743 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.744 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.745 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.746 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.746 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.747 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.748 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.749 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.750 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.751 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.752 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.753 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.754 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.755 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.756 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.757 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.758 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.759 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.760 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.761 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.762 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.763 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.764 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.765 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.766 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.767 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.768 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.769 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.770 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.771 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.772 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.773 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.774 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.775 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.776 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.777 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.778 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.779 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.780 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.781 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.782 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.783 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.784 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 WARNING oslo_config.cfg [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 02 09:57:55 compute-1 nova_compute[225341]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 02 09:57:55 compute-1 nova_compute[225341]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 02 09:57:55 compute-1 nova_compute[225341]: and ``live_migration_inbound_addr`` respectively.
Feb 02 09:57:55 compute-1 nova_compute[225341]: ).  Its value may be silently ignored in the future.
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.785 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.786 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.787 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_secret_uuid        = d241d473-9fcb-5f74-b163-f1ca4454e7f1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.788 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.789 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.790 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.791 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.792 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.793 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.794 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.795 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.796 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.797 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.798 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.799 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.800 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.801 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.802 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.803 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.804 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.805 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.806 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.807 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.808 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.809 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.810 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.811 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.812 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.813 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.814 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.815 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.816 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.817 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.818 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.819 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.820 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.821 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.822 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.823 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.824 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.825 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.826 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.827 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.828 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.829 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.830 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.831 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.832 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.833 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.834 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.835 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.836 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.837 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.838 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.839 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.840 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.841 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.842 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.843 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.844 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.845 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.846 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.847 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.848 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.849 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.850 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.851 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.852 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.853 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.854 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.855 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.856 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.856 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.856 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.856 225345 DEBUG oslo_service.service [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.857 225345 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.877 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.877 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.878 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.878 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 02 09:57:55 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Feb 02 09:57:55 compute-1 systemd[1]: Started libvirt QEMU daemon.
Feb 02 09:57:55 compute-1 python3.9[225966]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.933 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7a452816d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.936 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7a452816d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.936 225345 INFO nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Connection event '1' reason 'None'
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.966 225345 WARNING nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Feb 02 09:57:55 compute-1 nova_compute[225341]: 2026-02-02 09:57:55.966 225345 DEBUG nova.virt.libvirt.volume.mount [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 02 09:57:56 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:57:56 compute-1 sudo[225964]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:56.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:56.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:56 compute-1 ceph-mon[80115]: pgmap v548: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:56 compute-1 sudo[226198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrwpxkkvhkdojtslogapqtvixyultnhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026276.384192-3591-24086217270990/AnsiballZ_systemd.py'
Feb 02 09:57:56 compute-1 sudo[226198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:57:56 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.755 225345 INFO nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host capabilities <capabilities>
Feb 02 09:57:56 compute-1 nova_compute[225341]: 
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <host>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <uuid>7f778d97-f318-4380-8776-2e4d99e5fd86</uuid>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <cpu>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <arch>x86_64</arch>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model>EPYC-Rome-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <vendor>AMD</vendor>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <microcode version='16777317'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <signature family='23' model='49' stepping='0'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='x2apic'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='tsc-deadline'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='osxsave'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='hypervisor'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='tsc_adjust'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='spec-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='stibp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='arch-capabilities'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='cmp_legacy'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='topoext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='virt-ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='lbrv'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='tsc-scale'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='vmcb-clean'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='pause-filter'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='pfthreshold'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='svme-addr-chk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='rdctl-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='skip-l1dfl-vmentry'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='mds-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature name='pschange-mc-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <pages unit='KiB' size='4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <pages unit='KiB' size='2048'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <pages unit='KiB' size='1048576'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </cpu>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <power_management>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <suspend_mem/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </power_management>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <iommu support='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <migration_features>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <live/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <uri_transports>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <uri_transport>tcp</uri_transport>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <uri_transport>rdma</uri_transport>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </uri_transports>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </migration_features>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <topology>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <cells num='1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <cell id='0'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:           <memory unit='KiB'>7864292</memory>
Feb 02 09:57:56 compute-1 nova_compute[225341]:           <pages unit='KiB' size='4'>1966073</pages>
Feb 02 09:57:56 compute-1 nova_compute[225341]:           <pages unit='KiB' size='2048'>0</pages>
Feb 02 09:57:56 compute-1 nova_compute[225341]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 02 09:57:56 compute-1 nova_compute[225341]:           <distances>
Feb 02 09:57:56 compute-1 nova_compute[225341]:             <sibling id='0' value='10'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:           </distances>
Feb 02 09:57:56 compute-1 nova_compute[225341]:           <cpus num='8'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:           </cpus>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         </cell>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </cells>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </topology>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <cache>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </cache>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <secmodel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model>selinux</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <doi>0</doi>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </secmodel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <secmodel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model>dac</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <doi>0</doi>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </secmodel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </host>
Feb 02 09:57:56 compute-1 nova_compute[225341]: 
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <guest>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <os_type>hvm</os_type>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <arch name='i686'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <wordsize>32</wordsize>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <domain type='qemu'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <domain type='kvm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </arch>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <features>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <pae/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <nonpae/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <acpi default='on' toggle='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <apic default='on' toggle='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <cpuselection/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <deviceboot/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <disksnapshot default='on' toggle='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <externalSnapshot/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </features>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </guest>
Feb 02 09:57:56 compute-1 nova_compute[225341]: 
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <guest>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <os_type>hvm</os_type>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <arch name='x86_64'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <wordsize>64</wordsize>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <domain type='qemu'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <domain type='kvm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </arch>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <features>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <acpi default='on' toggle='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <apic default='on' toggle='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <cpuselection/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <deviceboot/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <disksnapshot default='on' toggle='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <externalSnapshot/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </features>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </guest>
Feb 02 09:57:56 compute-1 nova_compute[225341]: 
Feb 02 09:57:56 compute-1 nova_compute[225341]: </capabilities>
Feb 02 09:57:56 compute-1 nova_compute[225341]: 
Feb 02 09:57:56 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.765 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 02 09:57:56 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.792 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 02 09:57:56 compute-1 nova_compute[225341]: <domainCapabilities>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <path>/usr/libexec/qemu-kvm</path>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <domain>kvm</domain>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <arch>i686</arch>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <vcpu max='240'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <iothreads supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <os supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <enum name='firmware'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <loader supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>rom</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pflash</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='readonly'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>yes</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>no</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='secure'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>no</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </loader>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </os>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <cpu>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='host-passthrough' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='hostPassthroughMigratable'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>on</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>off</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='maximum' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='maximumMigratable'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>on</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>off</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='host-model' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <vendor>AMD</vendor>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='x2apic'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc-deadline'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='hypervisor'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc_adjust'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='spec-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='stibp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='cmp_legacy'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='overflow-recov'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='succor'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='amd-ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='virt-ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='lbrv'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc-scale'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='vmcb-clean'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='flushbyasid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='pause-filter'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='pfthreshold'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='svme-addr-chk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='disable' name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='custom' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='ClearwaterForest'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ddpd-u'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sha512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm3'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='ClearwaterForest-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ddpd-u'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sha512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm3'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cooperlake'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cooperlake-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cooperlake-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Dhyana-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Turin'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibpb-brtype'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbpb'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Turin-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibpb-brtype'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbpb'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-128'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-256'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-128'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-256'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v6'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v7'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='KnightsMill'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4fmaps'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4vnniw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512er'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512pf'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='KnightsMill-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4fmaps'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4vnniw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512er'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512pf'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G4-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tbm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G5-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tbm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='athlon'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='athlon-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='core2duo'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='core2duo-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='coreduo'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='coreduo-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='n270'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='n270-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='phenom'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='phenom-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </cpu>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <memoryBacking supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <enum name='sourceType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>file</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>anonymous</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>memfd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </memoryBacking>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <devices>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <disk supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='diskDevice'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>disk</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>cdrom</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>floppy</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>lun</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='bus'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>ide</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>fdc</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>scsi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>sata</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-non-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </disk>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <graphics supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vnc</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>egl-headless</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>dbus</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </graphics>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <video supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='modelType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vga</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>cirrus</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>none</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>bochs</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>ramfb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </video>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <hostdev supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='mode'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>subsystem</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='startupPolicy'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>default</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>mandatory</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>requisite</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>optional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='subsysType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pci</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>scsi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='capsType'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='pciBackend'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </hostdev>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <rng supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-non-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>random</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>egd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>builtin</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </rng>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <filesystem supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='driverType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>path</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>handle</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtiofs</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </filesystem>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <tpm supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tpm-tis</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tpm-crb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>emulator</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>external</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendVersion'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>2.0</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </tpm>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <redirdev supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='bus'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </redirdev>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <channel supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pty</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>unix</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </channel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <crypto supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>qemu</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>builtin</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </crypto>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <interface supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>default</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>passt</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </interface>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <panic supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>isa</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>hyperv</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </panic>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <console supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>null</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vc</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pty</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>dev</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>file</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pipe</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>stdio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>udp</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tcp</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>unix</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>qemu-vdagent</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>dbus</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </console>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </devices>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <features>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <gic supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <vmcoreinfo supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <genid supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <backingStoreInput supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <backup supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <async-teardown supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <s390-pv supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <ps2 supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <tdx supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <sev supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <sgx supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <hyperv supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='features'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>relaxed</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vapic</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>spinlocks</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vpindex</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>runtime</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>synic</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>stimer</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>reset</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vendor_id</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>frequencies</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>reenlightenment</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tlbflush</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>ipi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>avic</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>emsr_bitmap</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>xmm_input</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <defaults>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <spinlocks>4095</spinlocks>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <stimer_direct>on</stimer_direct>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <tlbflush_direct>on</tlbflush_direct>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <tlbflush_extended>on</tlbflush_extended>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </defaults>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </hyperv>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <launchSecurity supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </features>
Feb 02 09:57:56 compute-1 nova_compute[225341]: </domainCapabilities>
Feb 02 09:57:56 compute-1 nova_compute[225341]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 02 09:57:56 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.801 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 02 09:57:56 compute-1 nova_compute[225341]: <domainCapabilities>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <path>/usr/libexec/qemu-kvm</path>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <domain>kvm</domain>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <arch>i686</arch>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <vcpu max='4096'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <iothreads supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <os supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <enum name='firmware'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <loader supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>rom</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pflash</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='readonly'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>yes</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>no</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='secure'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>no</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </loader>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </os>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <cpu>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='host-passthrough' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='hostPassthroughMigratable'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>on</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>off</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='maximum' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='maximumMigratable'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>on</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>off</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='host-model' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <vendor>AMD</vendor>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='x2apic'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc-deadline'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='hypervisor'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc_adjust'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='spec-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='stibp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='cmp_legacy'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='overflow-recov'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='succor'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='amd-ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='virt-ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='lbrv'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc-scale'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='vmcb-clean'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='flushbyasid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='pause-filter'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='pfthreshold'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='svme-addr-chk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='disable' name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='custom' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='ClearwaterForest'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ddpd-u'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sha512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm3'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='ClearwaterForest-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ddpd-u'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sha512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm3'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cooperlake'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cooperlake-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cooperlake-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Dhyana-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Turin'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibpb-brtype'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbpb'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Turin-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibpb-brtype'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbpb'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-128'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-256'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-128'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-256'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v6'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v7'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='KnightsMill'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4fmaps'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4vnniw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512er'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512pf'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='KnightsMill-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4fmaps'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4vnniw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512er'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512pf'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G4-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tbm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G5-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tbm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='athlon'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='athlon-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='core2duo'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='core2duo-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='coreduo'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='coreduo-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='n270'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='n270-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='phenom'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='phenom-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </cpu>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <memoryBacking supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <enum name='sourceType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>file</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>anonymous</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>memfd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </memoryBacking>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <devices>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <disk supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='diskDevice'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>disk</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>cdrom</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>floppy</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>lun</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='bus'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>fdc</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>scsi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>sata</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-non-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </disk>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <graphics supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vnc</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>egl-headless</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>dbus</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </graphics>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <video supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='modelType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vga</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>cirrus</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>none</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>bochs</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>ramfb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </video>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <hostdev supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='mode'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>subsystem</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='startupPolicy'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>default</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>mandatory</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>requisite</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>optional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='subsysType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pci</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>scsi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='capsType'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='pciBackend'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </hostdev>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <rng supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-non-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>random</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>egd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>builtin</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </rng>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <filesystem supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='driverType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>path</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>handle</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtiofs</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </filesystem>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <tpm supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tpm-tis</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tpm-crb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>emulator</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>external</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendVersion'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>2.0</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </tpm>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <redirdev supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='bus'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </redirdev>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <channel supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pty</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>unix</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </channel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <crypto supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>qemu</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>builtin</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </crypto>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <interface supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>default</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>passt</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </interface>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <panic supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>isa</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>hyperv</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </panic>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <console supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>null</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vc</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pty</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>dev</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>file</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pipe</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>stdio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>udp</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tcp</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>unix</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>qemu-vdagent</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>dbus</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </console>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </devices>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <features>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <gic supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <vmcoreinfo supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <genid supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <backingStoreInput supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <backup supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <async-teardown supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <s390-pv supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <ps2 supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <tdx supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <sev supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <sgx supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <hyperv supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='features'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>relaxed</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vapic</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>spinlocks</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vpindex</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>runtime</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>synic</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>stimer</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>reset</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vendor_id</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>frequencies</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>reenlightenment</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tlbflush</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>ipi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>avic</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>emsr_bitmap</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>xmm_input</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <defaults>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <spinlocks>4095</spinlocks>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <stimer_direct>on</stimer_direct>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <tlbflush_direct>on</tlbflush_direct>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <tlbflush_extended>on</tlbflush_extended>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </defaults>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </hyperv>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <launchSecurity supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </features>
Feb 02 09:57:56 compute-1 nova_compute[225341]: </domainCapabilities>
Feb 02 09:57:56 compute-1 nova_compute[225341]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 02 09:57:56 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.874 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 02 09:57:56 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.878 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 02 09:57:56 compute-1 nova_compute[225341]: <domainCapabilities>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <path>/usr/libexec/qemu-kvm</path>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <domain>kvm</domain>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <arch>x86_64</arch>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <vcpu max='240'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <iothreads supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <os supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <enum name='firmware'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <loader supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>rom</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pflash</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='readonly'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>yes</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>no</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='secure'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>no</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </loader>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </os>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <cpu>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='host-passthrough' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='hostPassthroughMigratable'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>on</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>off</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='maximum' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='maximumMigratable'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>on</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>off</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='host-model' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <vendor>AMD</vendor>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='x2apic'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc-deadline'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='hypervisor'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc_adjust'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='spec-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='stibp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='cmp_legacy'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='overflow-recov'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='succor'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='amd-ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='virt-ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='lbrv'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc-scale'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='vmcb-clean'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='flushbyasid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='pause-filter'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='pfthreshold'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='svme-addr-chk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='disable' name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='custom' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='ClearwaterForest'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ddpd-u'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sha512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm3'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='ClearwaterForest-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ddpd-u'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sha512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm3'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sm4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cooperlake'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cooperlake-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cooperlake-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Denverton-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Dhyana-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 sudo[226206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 sudo[226206]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Turin'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:57:56 compute-1 sudo[226206]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibpb-brtype'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbpb'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-Turin-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibpb-brtype'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbpb'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='EPYC-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-128'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-256'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-128'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-256'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx10-512'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Haswell-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v6'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v7'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='KnightsMill'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4fmaps'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4vnniw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512er'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512pf'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='KnightsMill-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4fmaps'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-4vnniw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512er'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512pf'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G4-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tbm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Opteron_G5-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tbm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v5'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='athlon'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='athlon-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='core2duo'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='core2duo-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='coreduo'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='coreduo-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='n270'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='n270-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='phenom'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='phenom-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </cpu>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <memoryBacking supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <enum name='sourceType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>file</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>anonymous</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>memfd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </memoryBacking>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <devices>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <disk supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='diskDevice'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>disk</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>cdrom</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>floppy</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>lun</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='bus'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>ide</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>fdc</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>scsi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>sata</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-non-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </disk>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <graphics supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vnc</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>egl-headless</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>dbus</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </graphics>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <video supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='modelType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vga</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>cirrus</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>none</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>bochs</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>ramfb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </video>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <hostdev supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='mode'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>subsystem</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='startupPolicy'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>default</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>mandatory</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>requisite</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>optional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='subsysType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pci</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>scsi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='capsType'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='pciBackend'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </hostdev>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <rng supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtio-non-transitional</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>random</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>egd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>builtin</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </rng>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <filesystem supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='driverType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>path</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>handle</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>virtiofs</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </filesystem>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <tpm supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tpm-tis</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tpm-crb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>emulator</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>external</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendVersion'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>2.0</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </tpm>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <redirdev supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='bus'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </redirdev>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <channel supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pty</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>unix</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </channel>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <crypto supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>qemu</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>builtin</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </crypto>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <interface supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='backendType'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>default</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>passt</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </interface>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <panic supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>isa</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>hyperv</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </panic>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <console supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>null</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vc</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pty</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>dev</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>file</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pipe</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>stdio</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>udp</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tcp</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>unix</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>qemu-vdagent</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>dbus</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </console>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </devices>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <features>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <gic supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <vmcoreinfo supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <genid supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <backingStoreInput supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <backup supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <async-teardown supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <s390-pv supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <ps2 supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <tdx supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <sev supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <sgx supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <hyperv supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='features'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>relaxed</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vapic</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>spinlocks</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vpindex</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>runtime</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>synic</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>stimer</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>reset</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>vendor_id</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>frequencies</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>reenlightenment</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>tlbflush</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>ipi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>avic</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>emsr_bitmap</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>xmm_input</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <defaults>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <spinlocks>4095</spinlocks>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <stimer_direct>on</stimer_direct>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <tlbflush_direct>on</tlbflush_direct>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <tlbflush_extended>on</tlbflush_extended>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </defaults>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </hyperv>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <launchSecurity supported='no'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </features>
Feb 02 09:57:56 compute-1 nova_compute[225341]: </domainCapabilities>
Feb 02 09:57:56 compute-1 nova_compute[225341]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 02 09:57:56 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.943 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 02 09:57:56 compute-1 nova_compute[225341]: <domainCapabilities>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <path>/usr/libexec/qemu-kvm</path>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <domain>kvm</domain>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <arch>x86_64</arch>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <vcpu max='4096'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <iothreads supported='yes'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <os supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <enum name='firmware'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>efi</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <loader supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>rom</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>pflash</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='readonly'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>yes</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>no</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='secure'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>yes</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>no</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </loader>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   </os>
Feb 02 09:57:56 compute-1 nova_compute[225341]:   <cpu>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='host-passthrough' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='hostPassthroughMigratable'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>on</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>off</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='maximum' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <enum name='maximumMigratable'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>on</value>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <value>off</value>
Feb 02 09:57:56 compute-1 python3.9[226200]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='host-model' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <vendor>AMD</vendor>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='x2apic'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc-deadline'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='hypervisor'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc_adjust'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='spec-ctrl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='stibp'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='cmp_legacy'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='overflow-recov'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='succor'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='ibrs'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='amd-ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='virt-ssbd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='lbrv'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='tsc-scale'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='vmcb-clean'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='flushbyasid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='pause-filter'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='pfthreshold'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='svme-addr-chk'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <feature policy='disable' name='xsaves'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:56 compute-1 nova_compute[225341]:     <mode name='custom' supported='yes'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v3'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Broadwell-v4'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v1'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 02 09:57:56 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v2'>
Feb 02 09:57:56 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v4'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Cascadelake-Server-v5'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='ClearwaterForest'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bhi-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ddpd-u'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sha512'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sm3'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sm4'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='ClearwaterForest-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bhi-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ddpd-u'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sha512'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sm3'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sm4'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Cooperlake'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Cooperlake-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Cooperlake-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Denverton'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Denverton-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Denverton-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Denverton-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Dhyana-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Genoa-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Milan-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Rome-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Turin'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibpb-brtype'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='prefetchi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbpb'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-Turin-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amd-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='auto-ibrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibpb-brtype'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='no-nested-data-bp'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='null-sel-clr-base'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='perfmon-v2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='prefetchi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbpb'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='stibp-always-on'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-v4'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='EPYC-v5'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx10'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx10-128'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx10-256'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx10-512'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='GraniteRapids-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx10'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx10-128'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx10-256'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx10-512'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='prefetchiti'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Haswell'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Haswell-IBRS'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Haswell-noTSX'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Haswell-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Haswell-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Haswell-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Haswell-v4'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-noTSX'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v4'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v5'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v6'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Icelake-Server-v7'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='IvyBridge'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-IBRS'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='IvyBridge-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='KnightsMill'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-4fmaps'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-4vnniw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512er'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512pf'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='KnightsMill-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-4fmaps'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-4vnniw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512er'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512pf'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Opteron_G4'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Opteron_G4-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Opteron_G5'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tbm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Opteron_G5-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fma4'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tbm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xop'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 systemd[1]: Stopping nova_compute container...
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='SapphireRapids-v4'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='amx-tile'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-bf16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-fp16'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bitalg'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vbmi2'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrc'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fzrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='la57'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='taa-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='tsx-ldtrk'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='SierraForest'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='SierraForest-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ifma'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-ne-convert'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx-vnni-int8'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bhi-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='bus-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cmpccxadd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fbsdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='fsrs'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ibrs-all'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='intel-psfd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ipred-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='lam'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mcdt-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pbrsb-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='psdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rrsba-ctrl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='serialize'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vaes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='vpclmulqdq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-IBRS'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Client-v4'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-IBRS'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='hle'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='rtm'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v4'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Skylake-Server-v5'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512bw'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512cd'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512dq'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512f'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='avx512vl'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='invpcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pcid'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='pku'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Snowridge'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='mpx'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v2'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v3'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='core-capability'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='split-lock-detect'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='Snowridge-v4'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='cldemote'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='erms'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='gfni'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdir64b'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='movdiri'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='xsaves'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='athlon'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='athlon-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='core2duo'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='core2duo-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='coreduo'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='coreduo-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='n270'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='n270-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='ss'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='phenom'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <blockers model='phenom-v1'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='3dnow'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <feature name='3dnowext'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </blockers>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </mode>
Feb 02 09:57:57 compute-1 nova_compute[225341]:   </cpu>
Feb 02 09:57:57 compute-1 nova_compute[225341]:   <memoryBacking supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <enum name='sourceType'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <value>file</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <value>anonymous</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <value>memfd</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:   </memoryBacking>
Feb 02 09:57:57 compute-1 nova_compute[225341]:   <devices>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <disk supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='diskDevice'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>disk</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>cdrom</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>floppy</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>lun</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='bus'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>fdc</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>scsi</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>sata</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>virtio-transitional</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>virtio-non-transitional</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </disk>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <graphics supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>vnc</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>egl-headless</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>dbus</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </graphics>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <video supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='modelType'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>vga</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>cirrus</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>none</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>bochs</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>ramfb</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </video>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <hostdev supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='mode'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>subsystem</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='startupPolicy'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>default</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>mandatory</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>requisite</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>optional</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='subsysType'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>pci</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>scsi</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='capsType'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='pciBackend'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </hostdev>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <rng supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>virtio</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>virtio-transitional</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>virtio-non-transitional</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>random</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>egd</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>builtin</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </rng>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <filesystem supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='driverType'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>path</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>handle</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>virtiofs</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </filesystem>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <tpm supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>tpm-tis</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>tpm-crb</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>emulator</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>external</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='backendVersion'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>2.0</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </tpm>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <redirdev supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='bus'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>usb</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </redirdev>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <channel supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>pty</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>unix</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </channel>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <crypto supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='model'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>qemu</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='backendModel'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>builtin</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </crypto>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <interface supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='backendType'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>default</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>passt</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </interface>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <panic supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='model'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>isa</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>hyperv</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </panic>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <console supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='type'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>null</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>vc</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>pty</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>dev</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>file</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>pipe</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>stdio</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>udp</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>tcp</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>unix</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>qemu-vdagent</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>dbus</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </console>
Feb 02 09:57:57 compute-1 nova_compute[225341]:   </devices>
Feb 02 09:57:57 compute-1 nova_compute[225341]:   <features>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <gic supported='no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <vmcoreinfo supported='yes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <genid supported='yes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <backingStoreInput supported='yes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <backup supported='yes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <async-teardown supported='yes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <s390-pv supported='no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <ps2 supported='yes'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <tdx supported='no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <sev supported='no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <sgx supported='no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <hyperv supported='yes'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <enum name='features'>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>relaxed</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>vapic</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>spinlocks</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>vpindex</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>runtime</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>synic</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>stimer</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>reset</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>vendor_id</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>frequencies</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>reenlightenment</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>tlbflush</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>ipi</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>avic</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>emsr_bitmap</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <value>xmm_input</value>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </enum>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       <defaults>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <spinlocks>4095</spinlocks>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <stimer_direct>on</stimer_direct>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <tlbflush_direct>on</tlbflush_direct>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <tlbflush_extended>on</tlbflush_extended>
Feb 02 09:57:57 compute-1 nova_compute[225341]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 02 09:57:57 compute-1 nova_compute[225341]:       </defaults>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     </hyperv>
Feb 02 09:57:57 compute-1 nova_compute[225341]:     <launchSecurity supported='no'/>
Feb 02 09:57:57 compute-1 nova_compute[225341]:   </features>
Feb 02 09:57:57 compute-1 nova_compute[225341]: </domainCapabilities>
Feb 02 09:57:57 compute-1 nova_compute[225341]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
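The domainCapabilities document dumped above is what nova's _get_domain_capabilities() receives from libvirt before the Secure Boot and vTPM probes that follow. A minimal sketch of fetching and inspecting the same XML with libvirt-python, assuming qemu:///system is reachable; the variable names are ours, not nova's:

    import xml.etree.ElementTree as ET
    import libvirt

    conn = libvirt.open('qemu:///system')
    # Same underlying call nova wraps; emulator/arch/machine/virttype filters are optional.
    caps_xml = conn.getDomainCapabilities(None, 'x86_64', None, 'kvm', 0)
    root = ET.fromstring(caps_xml)
    # For example, read the supported video model types from the <video> enum above.
    models = [v.text for v in
              root.findall("./devices/video/enum[@name='modelType']/value")]
    print(models)   # e.g. ['vga', 'cirrus', 'virtio', 'none', 'bochs', 'ramfb']
    conn.close()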
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.998 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.999 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:56.999 225345 DEBUG nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:57.003 225345 INFO nova.virt.libvirt.host [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Secure Boot support detected
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:57.005 225345 INFO nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:57.005 225345 INFO nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:57.013 225345 DEBUG nova.virt.libvirt.driver [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:57.064 225345 INFO nova.virt.node [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Determined node identity 8e32c057-ad28-4c19-8374-763e0c1c8622 from /var/lib/nova/compute_id
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:57.082 225345 WARNING nova.compute.manager [None req-a768f377-541c-4e5a-9cf1-05c49e440e26 - - - - - -] Compute nodes ['8e32c057-ad28-4c19-8374-763e0c1c8622'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:57.086 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:57.086 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 09:57:57 compute-1 nova_compute[225341]: 2026-02-02 09:57:57.086 225345 DEBUG oslo_concurrency.lockutils [None req-c0ba57ca-5570-4204-aae8-1b56e76c9536 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
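The three lockutils lines above are the usual oslo.concurrency pattern: take the named lock, run the critical section, release. A tiny sketch of the call shape that produces them, with only the lock name taken from the log and everything else assumed:

    from oslo_concurrency import lockutils

    with lockutils.lock('singleton_lock'):
        # critical section; oslo_concurrency logs the Acquiring/Acquired/
        # Releasing messages seen above around this block
        pass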
Feb 02 09:57:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:57:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:57 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:57 compute-1 virtqemud[225988]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Feb 02 09:57:57 compute-1 virtqemud[225988]: hostname: compute-1
Feb 02 09:57:57 compute-1 virtqemud[225988]: End of file while reading data: Input/output error
Feb 02 09:57:57 compute-1 systemd[1]: libpod-44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e.scope: Deactivated successfully.
Feb 02 09:57:57 compute-1 systemd[1]: libpod-44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e.scope: Consumed 3.065s CPU time.
Feb 02 09:57:57 compute-1 podman[226234]: 2026-02-02 09:57:57.505006021 +0000 UTC m=+0.462728510 container died 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Feb 02 09:57:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e-userdata-shm.mount: Deactivated successfully.
Feb 02 09:57:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154-merged.mount: Deactivated successfully.
Feb 02 09:57:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:57 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc724003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:57 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000052s ======
Feb 02 09:57:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:57:58.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Feb 02 09:57:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:57:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:57:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:57:58.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:57:58 compute-1 podman[226234]: 2026-02-02 09:57:58.648371276 +0000 UTC m=+1.606093735 container cleanup 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 09:57:58 compute-1 podman[226234]: nova_compute
Feb 02 09:57:58 compute-1 podman[226263]: nova_compute
Feb 02 09:57:58 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 02 09:57:58 compute-1 systemd[1]: Stopped nova_compute container.
Feb 02 09:57:58 compute-1 systemd[1]: Starting nova_compute container...
Feb 02 09:57:58 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:57:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff854ca9e4e8bf877a980a70dc449dc8b76c85e2d48fcddfd47deaf64bc04154/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 02 09:57:58 compute-1 podman[226276]: 2026-02-02 09:57:58.891322631 +0000 UTC m=+0.140103349 container init 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Feb 02 09:57:58 compute-1 podman[226276]: 2026-02-02 09:57:58.899227483 +0000 UTC m=+0.148008191 container start 44a8f98e42b54b85c089d415e06284450a966543946fe740f844231d5b8c260e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 02 09:57:58 compute-1 nova_compute[226294]: + sudo -E kolla_set_configs
Feb 02 09:57:58 compute-1 podman[226276]: nova_compute
Feb 02 09:57:58 compute-1 systemd[1]: Started nova_compute container.
Feb 02 09:57:58 compute-1 sudo[226198]: pam_unix(sudo:session): session closed for user root
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Validating config file
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying service configuration files
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /etc/ceph
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Creating directory /etc/ceph
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 02 09:57:58 compute-1 nova_compute[226294]: INFO:__main__:Writing out command to execute
Feb 02 09:57:59 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 02 09:57:59 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 02 09:57:59 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 02 09:57:59 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 02 09:57:59 compute-1 nova_compute[226294]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 02 09:57:59 compute-1 nova_compute[226294]: ++ cat /run_command
Feb 02 09:57:59 compute-1 nova_compute[226294]: + CMD=nova-compute
Feb 02 09:57:59 compute-1 nova_compute[226294]: + ARGS=
Feb 02 09:57:59 compute-1 nova_compute[226294]: + sudo kolla_copy_cacerts
Feb 02 09:57:59 compute-1 nova_compute[226294]: + [[ ! -n '' ]]
Feb 02 09:57:59 compute-1 nova_compute[226294]: + . kolla_extend_start
Feb 02 09:57:59 compute-1 nova_compute[226294]: Running command: 'nova-compute'
Feb 02 09:57:59 compute-1 nova_compute[226294]: + echo 'Running command: '\''nova-compute'\'''
Feb 02 09:57:59 compute-1 nova_compute[226294]: + umask 0022
Feb 02 09:57:59 compute-1 nova_compute[226294]: + exec nova-compute
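The kolla_set_configs output above is a plain delete/copy/chmod pass over the files declared in /var/lib/kolla/config_files/config.json (strategy COPY_ALWAYS), after which the container execs nova-compute. This is not the real implementation, only a sketch of that loop over a simplified (source, dest, perm) layout made up for illustration:

    import os
    import shutil

    CONFIG_FILES = [
        ('/var/lib/kolla/config_files/01-nova.conf',
         '/etc/nova/nova.conf.d/01-nova.conf', 0o600),
        # ...one entry per "Copying ..." line in the log above
    ]

    for source, dest, perm in CONFIG_FILES:
        if os.path.exists(dest):
            os.remove(dest)          # "Deleting <dest>"
        shutil.copy2(source, dest)   # "Copying <source> to <dest>"
        os.chmod(dest, perm)         # "Setting permission for <dest>"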
Feb 02 09:57:59 compute-1 ceph-mon[80115]: pgmap v549: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:57:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:59 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc720001b20 fd 37 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:57:59 compute-1 kernel: ganesha.nfsd[221734]: segfault at 50 ip 00007fc7c990232e sp 00007fc7357f9210 error 4 in libntirpc.so.5.8[7fc7c98e7000+2c000] likely on CPU 6 (core 0, socket 6)
Feb 02 09:57:59 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:57:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[219563]: 02/02/2026 09:57:59 : epoch 698074f5 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc728003820 fd 37 proxy ignored for local
Feb 02 09:57:59 compute-1 systemd[1]: Started Process Core Dump (PID 226330/UID 0).
Feb 02 09:58:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:58:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:00.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:58:00 compute-1 sudo[226458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmrymggtxvljawlqmcmiieoyppjzfckp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1770026280.0757983-3618-76822130573622/AnsiballZ_podman_container.py'
Feb 02 09:58:00 compute-1 sudo[226458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 09:58:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:00.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:00 compute-1 systemd-coredump[226331]: Process 219567 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 55:
                                                    #0  0x00007fc7c990232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007fc7c990c900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 09:58:00 compute-1 python3.9[226460]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 02 09:58:00 compute-1 systemd[1]: systemd-coredump@8-226330-0.service: Deactivated successfully.
Feb 02 09:58:00 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 09:58:00 compute-1 ceph-mon[80115]: pgmap v550: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:58:00 compute-1 podman[226467]: 2026-02-02 09:58:00.687019155 +0000 UTC m=+0.030540091 container died 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True)
Feb 02 09:58:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-73dc38675cb95bd2ce9e007b58ce4c3a2f4231b213b0e6b3b48bce8e41439171-merged.mount: Deactivated successfully.
Feb 02 09:58:00 compute-1 podman[226467]: 2026-02-02 09:58:00.818889644 +0000 UTC m=+0.162410540 container remove 86fb8b86fe4fd313b488fe64e1e5eb8bb5d56efdf256a6fe24227489fcde9b6e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Feb 02 09:58:00 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:58:00 compute-1 systemd[1]: Started libpod-conmon-f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a.scope.
Feb 02 09:58:00 compute-1 systemd[1]: Started libcrun container.
Feb 02 09:58:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff11bf020beacd9d846ed3e45cbd9b061d3e32b2ecd9b949861e46bfe46e1ac1/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 02 09:58:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff11bf020beacd9d846ed3e45cbd9b061d3e32b2ecd9b949861e46bfe46e1ac1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 02 09:58:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff11bf020beacd9d846ed3e45cbd9b061d3e32b2ecd9b949861e46bfe46e1ac1/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 02 09:58:00 compute-1 podman[226503]: 2026-02-02 09:58:00.932343372 +0000 UTC m=+0.190035935 container init f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 02 09:58:00 compute-1 podman[226503]: 2026-02-02 09:58:00.940792947 +0000 UTC m=+0.198485460 container start f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 09:58:00 compute-1 python3.9[226460]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Applying nova statedir ownership
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 02 09:58:00 compute-1 nova_compute_init[226539]: INFO:nova_statedir:Nova statedir ownership complete
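The nova_compute_init messages above come from nova_statedir_ownership.py: it walks /var/lib/nova, re-chowns anything not already owned by the in-container nova uid/gid (42436), restores the SELinux context, and skips the path named in NOVA_STATEDIR_OWNERSHIP_SKIP. A rough sketch of that walk, taking only the uid/gid and skip path from the log; the SELinux step is omitted and the constants are ours:

    import os

    TARGET_UID = TARGET_GID = 42436
    SKIP = os.environ.get('NOVA_STATEDIR_OWNERSHIP_SKIP',
                          '/var/lib/nova/compute_id')

    for dirpath, dirnames, filenames in os.walk('/var/lib/nova'):
        for path in [dirpath] + [os.path.join(dirpath, f) for f in filenames]:
            if path == SKIP:
                continue
            st = os.stat(path)
            if (st.st_uid, st.st_gid) != (TARGET_UID, TARGET_GID):
                # matches the "Changing ownership of ... from 1000:1000
                # to 42436:42436" lines above
                os.chown(path, TARGET_UID, TARGET_GID)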
Feb 02 09:58:01 compute-1 systemd[1]: libpod-f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a.scope: Deactivated successfully.
Feb 02 09:58:01 compute-1 podman[226541]: 2026-02-02 09:58:01.019104298 +0000 UTC m=+0.047422043 container died f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm)
Feb 02 09:58:01 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:58:01 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.206s CPU time.
Feb 02 09:58:01 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a-userdata-shm.mount: Deactivated successfully.
Feb 02 09:58:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-ff11bf020beacd9d846ed3e45cbd9b061d3e32b2ecd9b949861e46bfe46e1ac1-merged.mount: Deactivated successfully.
Feb 02 09:58:01 compute-1 podman[226563]: 2026-02-02 09:58:01.080510786 +0000 UTC m=+0.066314715 container cleanup f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:58:01 compute-1 systemd[1]: libpod-conmon-f928f7489bec4355ce1c96cdb3df24e598d845581ddb3fb4d77610b087f1888a.scope: Deactivated successfully.
Feb 02 09:58:01 compute-1 sudo[226458]: pam_unix(sudo:session): session closed for user root
Feb 02 09:58:01 compute-1 nova_compute[226294]: 2026-02-02 09:58:01.484 226298 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 02 09:58:01 compute-1 nova_compute[226294]: 2026-02-02 09:58:01.485 226298 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 02 09:58:01 compute-1 nova_compute[226294]: 2026-02-02 09:58:01.485 226298 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 02 09:58:01 compute-1 nova_compute[226294]: 2026-02-02 09:58:01.485 226298 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 02 09:58:01 compute-1 nova_compute[226294]: 2026-02-02 09:58:01.617 226298 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 09:58:01 compute-1 nova_compute[226294]: 2026-02-02 09:58:01.637 226298 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 09:58:01 compute-1 nova_compute[226294]: 2026-02-02 09:58:01.638 226298 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
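The three processutils DEBUG lines above are a capability probe: grep the iscsiadm binary for the node.session.scan option and treat a non-zero exit as the feature being absent (here the grep returns 1, likely because /usr/sbin/iscsiadm was replaced by the run-on-host shim copied in earlier). A hedged sketch of that probe using oslo's processutils, as the log does; the wrapper function name is ours:

    from oslo_concurrency import processutils

    def iscsiadm_supports_manual_scan(path='/sbin/iscsiadm'):
        try:
            # processutils.execute raises on non-zero exit by default
            processutils.execute('grep', '-F', 'node.session.scan', path)
            return True
        except processutils.ProcessExecutionError:
            # grep exits 1 when the pattern is absent, as in the log above
            return False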
Feb 02 09:58:01 compute-1 sshd-session[202245]: Connection closed by 192.168.122.30 port 57414
Feb 02 09:58:01 compute-1 sshd-session[202242]: pam_unix(sshd:session): session closed for user zuul
Feb 02 09:58:01 compute-1 systemd[1]: session-53.scope: Deactivated successfully.
Feb 02 09:58:01 compute-1 systemd[1]: session-53.scope: Consumed 1min 54.117s CPU time.
Feb 02 09:58:01 compute-1 systemd-logind[805]: Session 53 logged out. Waiting for processes to exit.
Feb 02 09:58:01 compute-1 systemd-logind[805]: Removed session 53.
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.070 226298 INFO nova.virt.driver [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.225 226298 INFO nova.compute.provider_config [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.243 226298 DEBUG oslo_concurrency.lockutils [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.243 226298 DEBUG oslo_concurrency.lockutils [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.244 226298 DEBUG oslo_concurrency.lockutils [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.244 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.244 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.244 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.245 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.245 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.245 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.245 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.246 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.247 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.247 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.247 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:02.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.248 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.248 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.248 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.248 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.249 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.250 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.250 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.250 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.250 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.251 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.252 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.252 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.252 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.252 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.253 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.253 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.253 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.253 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.254 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.254 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.254 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.255 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.255 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.255 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.255 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.256 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.257 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.258 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.258 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.258 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.258 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.259 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.259 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.259 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.259 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.260 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.261 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.261 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.261 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.261 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.262 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.262 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.262 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.262 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.263 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.264 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.265 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.266 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.267 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.268 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.269 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.270 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.271 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.272 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.273 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.274 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.275 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.276 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.277 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.278 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.279 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.280 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.281 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.282 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.283 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.284 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.285 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.286 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.287 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.288 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.289 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.290 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.291 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.292 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.293 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.294 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.295 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.296 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.297 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.298 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.299 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.300 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.301 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.302 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.303 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.304 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.305 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.306 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.307 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.308 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.309 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.310 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.311 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.312 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.313 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.314 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.315 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.316 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.317 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.318 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.319 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.320 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.321 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.322 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.323 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 WARNING oslo_config.cfg [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 02 09:58:02 compute-1 nova_compute[226294]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 02 09:58:02 compute-1 nova_compute[226294]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 02 09:58:02 compute-1 nova_compute[226294]: and ``live_migration_inbound_addr`` respectively.
Feb 02 09:58:02 compute-1 nova_compute[226294]: ).  Its value may be silently ignored in the future.
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.324 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.325 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.326 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_secret_uuid        = d241d473-9fcb-5f74-b163-f1ca4454e7f1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.327 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.328 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.329 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.330 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.331 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.332 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.333 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.334 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.335 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.336 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.337 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.338 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.339 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.340 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.341 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.342 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.343 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.344 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.345 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.346 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.347 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.348 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.349 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.350 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.351 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.352 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.353 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.354 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.355 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.356 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.357 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.358 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.359 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.360 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.361 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.362 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.363 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.364 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.365 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.366 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:02.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.367 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.368 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.369 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.370 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.371 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.372 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.373 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.374 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.375 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.376 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.377 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.378 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.379 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.380 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.381 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.382 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.383 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.384 226298 DEBUG oslo_service.service [None req-9db0eecc-7299-4c2b-aeac-2682fc314033 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.385 226298 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.399 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.399 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.400 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.400 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.410 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f858bc1b610> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.412 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f858bc1b610> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.412 226298 INFO nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Connection event '1' reason 'None'
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.420 226298 INFO nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host capabilities <capabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]: 
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <host>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <uuid>7f778d97-f318-4380-8776-2e4d99e5fd86</uuid>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <arch>x86_64</arch>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model>EPYC-Rome-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <vendor>AMD</vendor>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <microcode version='16777317'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <signature family='23' model='49' stepping='0'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='x2apic'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='tsc-deadline'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='osxsave'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='hypervisor'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='tsc_adjust'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='spec-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='stibp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='arch-capabilities'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='cmp_legacy'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='topoext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='virt-ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='lbrv'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='tsc-scale'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='vmcb-clean'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='pause-filter'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='pfthreshold'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='svme-addr-chk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='rdctl-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='skip-l1dfl-vmentry'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='mds-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature name='pschange-mc-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <pages unit='KiB' size='4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <pages unit='KiB' size='2048'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <pages unit='KiB' size='1048576'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <power_management>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <suspend_mem/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </power_management>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <iommu support='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <migration_features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <live/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <uri_transports>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <uri_transport>tcp</uri_transport>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <uri_transport>rdma</uri_transport>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </uri_transports>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </migration_features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <topology>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <cells num='1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <cell id='0'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:           <memory unit='KiB'>7864292</memory>
Feb 02 09:58:02 compute-1 nova_compute[226294]:           <pages unit='KiB' size='4'>1966073</pages>
Feb 02 09:58:02 compute-1 nova_compute[226294]:           <pages unit='KiB' size='2048'>0</pages>
Feb 02 09:58:02 compute-1 nova_compute[226294]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 02 09:58:02 compute-1 nova_compute[226294]:           <distances>
Feb 02 09:58:02 compute-1 nova_compute[226294]:             <sibling id='0' value='10'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:           </distances>
Feb 02 09:58:02 compute-1 nova_compute[226294]:           <cpus num='8'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:           </cpus>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         </cell>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </cells>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </topology>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <cache>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </cache>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <secmodel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model>selinux</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <doi>0</doi>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </secmodel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <secmodel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model>dac</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <doi>0</doi>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </secmodel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </host>
Feb 02 09:58:02 compute-1 nova_compute[226294]: 
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <guest>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <os_type>hvm</os_type>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <arch name='i686'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <wordsize>32</wordsize>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <domain type='qemu'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <domain type='kvm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </arch>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <pae/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <nonpae/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <acpi default='on' toggle='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <apic default='on' toggle='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <cpuselection/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <deviceboot/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <disksnapshot default='on' toggle='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <externalSnapshot/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </guest>
Feb 02 09:58:02 compute-1 nova_compute[226294]: 
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <guest>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <os_type>hvm</os_type>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <arch name='x86_64'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <wordsize>64</wordsize>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <domain type='qemu'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <domain type='kvm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </arch>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <acpi default='on' toggle='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <apic default='on' toggle='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <cpuselection/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <deviceboot/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <disksnapshot default='on' toggle='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <externalSnapshot/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </guest>
Feb 02 09:58:02 compute-1 nova_compute[226294]: 
Feb 02 09:58:02 compute-1 nova_compute[226294]: </capabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]: 
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.427 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.432 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 02 09:58:02 compute-1 nova_compute[226294]: <domainCapabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <path>/usr/libexec/qemu-kvm</path>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <domain>kvm</domain>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <arch>i686</arch>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <vcpu max='4096'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <iothreads supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <os supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <enum name='firmware'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <loader supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>rom</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pflash</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='readonly'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>yes</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>no</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='secure'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>no</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </loader>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </os>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='host-passthrough' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='hostPassthroughMigratable'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>on</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>off</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='maximum' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='maximumMigratable'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>on</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>off</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='host-model' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <vendor>AMD</vendor>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='x2apic'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc-deadline'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='hypervisor'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc_adjust'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='spec-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='stibp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='cmp_legacy'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='overflow-recov'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='succor'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='amd-ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='virt-ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='lbrv'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc-scale'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='vmcb-clean'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='flushbyasid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='pause-filter'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='pfthreshold'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='svme-addr-chk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='disable' name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='custom' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='ClearwaterForest'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ddpd-u'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sha512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='ClearwaterForest-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ddpd-u'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sha512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Dhyana-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Turin'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibpb-brtype'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbpb'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Turin-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibpb-brtype'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbpb'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-128'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-256'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-128'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-256'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v6'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v7'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='KnightsMill'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4fmaps'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4vnniw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512er'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512pf'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='KnightsMill-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4fmaps'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4vnniw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512er'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512pf'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G4-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tbm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G5-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tbm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='athlon'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='athlon-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='core2duo'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='core2duo-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='coreduo'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='coreduo-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='n270'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='n270-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='phenom'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='phenom-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <memoryBacking supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <enum name='sourceType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>file</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>anonymous</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>memfd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </memoryBacking>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <devices>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <disk supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='diskDevice'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>disk</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>cdrom</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>floppy</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>lun</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='bus'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>fdc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>scsi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>sata</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-non-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </disk>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <graphics supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vnc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>egl-headless</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dbus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </graphics>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <video supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='modelType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vga</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>cirrus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>none</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>bochs</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ramfb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </video>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <hostdev supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='mode'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>subsystem</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='startupPolicy'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>default</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>mandatory</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>requisite</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>optional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='subsysType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pci</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>scsi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='capsType'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='pciBackend'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </hostdev>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <rng supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-non-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>random</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>egd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>builtin</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </rng>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <filesystem supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='driverType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>path</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>handle</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtiofs</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </filesystem>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <tpm supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tpm-tis</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tpm-crb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>emulator</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>external</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendVersion'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>2.0</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </tpm>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <redirdev supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='bus'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </redirdev>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <channel supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pty</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>unix</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </channel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <crypto supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>qemu</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>builtin</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </crypto>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <interface supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>default</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>passt</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </interface>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <panic supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>isa</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>hyperv</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </panic>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <console supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>null</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pty</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dev</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>file</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pipe</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>stdio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>udp</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tcp</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>unix</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>qemu-vdagent</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dbus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </console>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </devices>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <gic supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <vmcoreinfo supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <genid supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <backingStoreInput supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <backup supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <async-teardown supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <s390-pv supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <ps2 supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <tdx supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <sev supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <sgx supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <hyperv supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='features'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>relaxed</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vapic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>spinlocks</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vpindex</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>runtime</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>synic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>stimer</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>reset</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vendor_id</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>frequencies</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>reenlightenment</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tlbflush</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ipi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>avic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>emsr_bitmap</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>xmm_input</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <defaults>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <spinlocks>4095</spinlocks>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <stimer_direct>on</stimer_direct>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <tlbflush_direct>on</tlbflush_direct>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <tlbflush_extended>on</tlbflush_extended>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </defaults>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </hyperv>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <launchSecurity supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </features>
Feb 02 09:58:02 compute-1 nova_compute[226294]: </domainCapabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.436 226298 WARNING nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.437 226298 DEBUG nova.virt.libvirt.volume.mount [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.447 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 02 09:58:02 compute-1 nova_compute[226294]: <domainCapabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <path>/usr/libexec/qemu-kvm</path>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <domain>kvm</domain>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <arch>i686</arch>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <vcpu max='240'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <iothreads supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <os supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <enum name='firmware'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <loader supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>rom</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pflash</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='readonly'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>yes</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>no</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='secure'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>no</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </loader>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </os>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='host-passthrough' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='hostPassthroughMigratable'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>on</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>off</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='maximum' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='maximumMigratable'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>on</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>off</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='host-model' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <vendor>AMD</vendor>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='x2apic'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc-deadline'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='hypervisor'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc_adjust'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='spec-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='stibp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='cmp_legacy'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='overflow-recov'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='succor'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='amd-ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='virt-ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='lbrv'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc-scale'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='vmcb-clean'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='flushbyasid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='pause-filter'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='pfthreshold'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='svme-addr-chk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='disable' name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='custom' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='ClearwaterForest'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ddpd-u'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sha512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='ClearwaterForest-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ddpd-u'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sha512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Dhyana-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Turin'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibpb-brtype'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbpb'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Turin-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibpb-brtype'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbpb'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-128'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-256'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-128'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-256'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v6'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v7'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='KnightsMill'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4fmaps'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4vnniw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512er'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512pf'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='KnightsMill-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4fmaps'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4vnniw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512er'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512pf'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G4-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tbm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G5-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tbm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='athlon'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='athlon-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='core2duo'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='core2duo-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='coreduo'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='coreduo-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='n270'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='n270-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='phenom'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='phenom-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <memoryBacking supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <enum name='sourceType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>file</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>anonymous</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>memfd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </memoryBacking>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <devices>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <disk supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='diskDevice'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>disk</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>cdrom</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>floppy</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>lun</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='bus'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ide</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>fdc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>scsi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>sata</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-non-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </disk>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <graphics supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vnc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>egl-headless</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dbus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </graphics>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <video supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='modelType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vga</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>cirrus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>none</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>bochs</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ramfb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </video>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <hostdev supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='mode'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>subsystem</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='startupPolicy'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>default</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>mandatory</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>requisite</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>optional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='subsysType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pci</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>scsi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='capsType'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='pciBackend'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </hostdev>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <rng supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-non-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>random</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>egd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>builtin</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </rng>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <filesystem supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='driverType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>path</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>handle</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtiofs</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </filesystem>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <tpm supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tpm-tis</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tpm-crb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>emulator</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>external</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendVersion'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>2.0</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </tpm>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <redirdev supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='bus'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </redirdev>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <channel supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pty</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>unix</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </channel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <crypto supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>qemu</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>builtin</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </crypto>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <interface supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>default</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>passt</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </interface>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <panic supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>isa</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>hyperv</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </panic>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <console supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>null</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pty</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dev</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>file</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pipe</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>stdio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>udp</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tcp</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>unix</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>qemu-vdagent</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dbus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </console>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </devices>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <gic supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <vmcoreinfo supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <genid supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <backingStoreInput supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <backup supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <async-teardown supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <s390-pv supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <ps2 supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <tdx supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <sev supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <sgx supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <hyperv supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='features'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>relaxed</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vapic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>spinlocks</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vpindex</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>runtime</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>synic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>stimer</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>reset</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vendor_id</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>frequencies</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>reenlightenment</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tlbflush</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ipi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>avic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>emsr_bitmap</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>xmm_input</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <defaults>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <spinlocks>4095</spinlocks>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <stimer_direct>on</stimer_direct>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <tlbflush_direct>on</tlbflush_direct>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <tlbflush_extended>on</tlbflush_extended>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </defaults>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </hyperv>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <launchSecurity supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </features>
Feb 02 09:58:02 compute-1 nova_compute[226294]: </domainCapabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.500 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.504 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 02 09:58:02 compute-1 nova_compute[226294]: <domainCapabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <path>/usr/libexec/qemu-kvm</path>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <domain>kvm</domain>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <arch>x86_64</arch>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <vcpu max='4096'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <iothreads supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <os supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <enum name='firmware'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>efi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <loader supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>rom</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pflash</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='readonly'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>yes</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>no</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='secure'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>yes</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>no</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </loader>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </os>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='host-passthrough' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='hostPassthroughMigratable'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>on</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>off</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='maximum' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='maximumMigratable'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>on</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>off</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='host-model' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <vendor>AMD</vendor>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='x2apic'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc-deadline'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='hypervisor'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc_adjust'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='spec-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='stibp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='cmp_legacy'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='overflow-recov'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='succor'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='amd-ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='virt-ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='lbrv'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc-scale'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='vmcb-clean'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='flushbyasid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='pause-filter'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='pfthreshold'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='svme-addr-chk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='disable' name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='custom' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='ClearwaterForest'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ddpd-u'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sha512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='ClearwaterForest-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ddpd-u'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sha512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Dhyana-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Turin'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibpb-brtype'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbpb'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Turin-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibpb-brtype'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbpb'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-128'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-256'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-128'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-256'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v6'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v7'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='KnightsMill'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4fmaps'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4vnniw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512er'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512pf'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='KnightsMill-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4fmaps'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4vnniw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512er'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512pf'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G4-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tbm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G5-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tbm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='athlon'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='athlon-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='core2duo'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='core2duo-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='coreduo'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='coreduo-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='n270'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='n270-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='phenom'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='phenom-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <memoryBacking supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <enum name='sourceType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>file</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>anonymous</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>memfd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </memoryBacking>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <devices>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <disk supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='diskDevice'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>disk</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>cdrom</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>floppy</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>lun</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='bus'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>fdc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>scsi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>sata</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-non-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </disk>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <graphics supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vnc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>egl-headless</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dbus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </graphics>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <video supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='modelType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vga</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>cirrus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>none</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>bochs</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ramfb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </video>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <hostdev supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='mode'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>subsystem</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='startupPolicy'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>default</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>mandatory</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>requisite</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>optional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='subsysType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pci</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>scsi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='capsType'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='pciBackend'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </hostdev>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <rng supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-non-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>random</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>egd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>builtin</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </rng>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <filesystem supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='driverType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>path</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>handle</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtiofs</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </filesystem>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <tpm supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tpm-tis</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tpm-crb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>emulator</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>external</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendVersion'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>2.0</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </tpm>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <redirdev supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='bus'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </redirdev>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <channel supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pty</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>unix</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </channel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <crypto supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>qemu</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>builtin</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </crypto>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <interface supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>default</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>passt</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </interface>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <panic supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>isa</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>hyperv</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </panic>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <console supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>null</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pty</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dev</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>file</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pipe</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>stdio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>udp</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tcp</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>unix</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>qemu-vdagent</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dbus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </console>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </devices>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <gic supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <vmcoreinfo supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <genid supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <backingStoreInput supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <backup supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <async-teardown supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <s390-pv supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <ps2 supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <tdx supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <sev supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <sgx supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <hyperv supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='features'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>relaxed</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vapic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>spinlocks</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vpindex</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>runtime</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>synic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>stimer</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>reset</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vendor_id</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>frequencies</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>reenlightenment</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tlbflush</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ipi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>avic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>emsr_bitmap</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>xmm_input</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <defaults>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <spinlocks>4095</spinlocks>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <stimer_direct>on</stimer_direct>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <tlbflush_direct>on</tlbflush_direct>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <tlbflush_extended>on</tlbflush_extended>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </defaults>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </hyperv>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <launchSecurity supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </features>
Feb 02 09:58:02 compute-1 nova_compute[226294]: </domainCapabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.590 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 02 09:58:02 compute-1 nova_compute[226294]: <domainCapabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <path>/usr/libexec/qemu-kvm</path>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <domain>kvm</domain>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <arch>x86_64</arch>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <vcpu max='240'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <iothreads supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <os supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <enum name='firmware'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <loader supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>rom</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pflash</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='readonly'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>yes</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>no</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='secure'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>no</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </loader>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </os>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='host-passthrough' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='hostPassthroughMigratable'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>on</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>off</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='maximum' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='maximumMigratable'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>on</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>off</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='host-model' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <vendor>AMD</vendor>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='x2apic'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc-deadline'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='hypervisor'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc_adjust'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='spec-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='stibp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='cmp_legacy'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='overflow-recov'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='succor'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='amd-ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='virt-ssbd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='lbrv'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='tsc-scale'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='vmcb-clean'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='flushbyasid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='pause-filter'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='pfthreshold'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='svme-addr-chk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <feature policy='disable' name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <mode name='custom' supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Broadwell-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cascadelake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='ClearwaterForest'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ddpd-u'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sha512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='ClearwaterForest-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ddpd-u'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sha512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm3'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sm4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Cooperlake-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Denverton-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Dhyana-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Genoa-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Milan-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Rome-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Turin'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibpb-brtype'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbpb'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-Turin-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amd-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='auto-ibrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vp2intersect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fs-gs-base-ns'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibpb-brtype'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='no-nested-data-bp'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='null-sel-clr-base'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='perfmon-v2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbpb'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='srso-user-kernel-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='stibp-always-on'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='EPYC-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-128'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-256'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='GraniteRapids-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-128'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-256'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx10-512'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='prefetchiti'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Haswell-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-noTSX'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v6'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Icelake-Server-v7'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='IvyBridge-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='KnightsMill'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4fmaps'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4vnniw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512er'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512pf'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='KnightsMill-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4fmaps'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-4vnniw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512er'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512pf'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G4-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tbm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Opteron_G5-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fma4'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tbm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xop'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SapphireRapids-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='amx-tile'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-bf16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-fp16'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512-vpopcntdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bitalg'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vbmi2'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrc'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fzrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='la57'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='taa-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='tsx-ldtrk'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='SierraForest-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ifma'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-ne-convert'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx-vnni-int8'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bhi-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='bus-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cmpccxadd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fbsdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='fsrs'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ibrs-all'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='intel-psfd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ipred-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='lam'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mcdt-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pbrsb-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='psdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rrsba-ctrl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='sbdr-ssdp-no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='serialize'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vaes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='vpclmulqdq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Client-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='hle'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='rtm'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Skylake-Server-v5'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512bw'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512cd'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512dq'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512f'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='avx512vl'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='invpcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pcid'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='pku'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='mpx'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v2'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v3'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='core-capability'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='split-lock-detect'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='Snowridge-v4'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='cldemote'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='erms'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='gfni'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdir64b'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='movdiri'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='xsaves'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='athlon'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='athlon-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='core2duo'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='core2duo-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='coreduo'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='coreduo-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='n270'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='n270-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='ss'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='phenom'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <blockers model='phenom-v1'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnow'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <feature name='3dnowext'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </blockers>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </mode>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </cpu>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <memoryBacking supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <enum name='sourceType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>file</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>anonymous</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <value>memfd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </memoryBacking>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <devices>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <disk supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='diskDevice'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>disk</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>cdrom</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>floppy</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>lun</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='bus'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ide</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>fdc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>scsi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>sata</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-non-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </disk>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <graphics supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vnc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>egl-headless</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dbus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </graphics>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <video supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='modelType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vga</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>cirrus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>none</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>bochs</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ramfb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </video>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <hostdev supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='mode'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>subsystem</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='startupPolicy'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>default</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>mandatory</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>requisite</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>optional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='subsysType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pci</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>scsi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='capsType'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='pciBackend'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </hostdev>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <rng supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtio-non-transitional</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>random</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>egd</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>builtin</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </rng>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <filesystem supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='driverType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>path</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>handle</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>virtiofs</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </filesystem>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <tpm supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tpm-tis</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tpm-crb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>emulator</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>external</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendVersion'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>2.0</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </tpm>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <redirdev supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='bus'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>usb</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </redirdev>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <channel supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pty</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>unix</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </channel>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <crypto supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>qemu</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendModel'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>builtin</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </crypto>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <interface supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='backendType'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>default</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>passt</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </interface>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <panic supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='model'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>isa</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>hyperv</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </panic>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <console supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='type'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>null</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vc</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pty</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dev</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>file</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>pipe</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>stdio</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>udp</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tcp</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>unix</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>qemu-vdagent</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>dbus</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </console>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </devices>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   <features>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <gic supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <vmcoreinfo supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <genid supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <backingStoreInput supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <backup supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <async-teardown supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <s390-pv supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <ps2 supported='yes'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <tdx supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <sev supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <sgx supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <hyperv supported='yes'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <enum name='features'>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>relaxed</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vapic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>spinlocks</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vpindex</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>runtime</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>synic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>stimer</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>reset</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>vendor_id</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>frequencies</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>reenlightenment</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>tlbflush</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>ipi</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>avic</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>emsr_bitmap</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <value>xmm_input</value>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </enum>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       <defaults>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <spinlocks>4095</spinlocks>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <stimer_direct>on</stimer_direct>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <tlbflush_direct>on</tlbflush_direct>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <tlbflush_extended>on</tlbflush_extended>
Feb 02 09:58:02 compute-1 nova_compute[226294]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 02 09:58:02 compute-1 nova_compute[226294]:       </defaults>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     </hyperv>
Feb 02 09:58:02 compute-1 nova_compute[226294]:     <launchSecurity supported='no'/>
Feb 02 09:58:02 compute-1 nova_compute[226294]:   </features>
Feb 02 09:58:02 compute-1 nova_compute[226294]: </domainCapabilities>
Feb 02 09:58:02 compute-1 nova_compute[226294]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.671 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.671 226298 INFO nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Secure Boot support detected
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.673 226298 INFO nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.673 226298 INFO nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.680 226298 DEBUG nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.696 226298 INFO nova.virt.node [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Determined node identity 8e32c057-ad28-4c19-8374-763e0c1c8622 from /var/lib/nova/compute_id
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.714 226298 WARNING nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Compute nodes ['8e32c057-ad28-4c19-8374-763e0c1c8622'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.737 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.776 226298 WARNING nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.776 226298 DEBUG oslo_concurrency.lockutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.777 226298 DEBUG oslo_concurrency.lockutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.777 226298 DEBUG oslo_concurrency.lockutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.777 226298 DEBUG nova.compute.resource_tracker [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 09:58:02 compute-1 nova_compute[226294]: 2026-02-02 09:58:02.777 226298 DEBUG oslo_concurrency.processutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 09:58:03 compute-1 rsyslogd[1009]: imjournal from <np0005604791:nova_compute>: begin to drop messages due to rate-limiting
Feb 02 09:58:03 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 09:58:03 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1649079883' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:58:03 compute-1 nova_compute[226294]: 2026-02-02 09:58:03.197 226298 DEBUG oslo_concurrency.processutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 09:58:03 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Feb 02 09:58:03 compute-1 systemd[1]: Started libvirt nodedev daemon.
Feb 02 09:58:03 compute-1 ceph-mon[80115]: pgmap v551: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:58:03 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2902250898' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:58:03 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1649079883' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:58:03 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1057447144' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:58:03 compute-1 nova_compute[226294]: 2026-02-02 09:58:03.472 226298 WARNING nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 09:58:03 compute-1 nova_compute[226294]: 2026-02-02 09:58:03.472 226298 DEBUG nova.compute.resource_tracker [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5291MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 09:58:03 compute-1 nova_compute[226294]: 2026-02-02 09:58:03.473 226298 DEBUG oslo_concurrency.lockutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:58:03 compute-1 nova_compute[226294]: 2026-02-02 09:58:03.473 226298 DEBUG oslo_concurrency.lockutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:58:03 compute-1 nova_compute[226294]: 2026-02-02 09:58:03.494 226298 WARNING nova.compute.resource_tracker [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] No compute node record for compute-1.ctlplane.example.com:8e32c057-ad28-4c19-8374-763e0c1c8622: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 8e32c057-ad28-4c19-8374-763e0c1c8622 could not be found.
Feb 02 09:58:03 compute-1 nova_compute[226294]: 2026-02-02 09:58:03.519 226298 INFO nova.compute.resource_tracker [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 8e32c057-ad28-4c19-8374-763e0c1c8622
Feb 02 09:58:03 compute-1 nova_compute[226294]: 2026-02-02 09:58:03.606 226298 DEBUG nova.compute.resource_tracker [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 09:58:03 compute-1 nova_compute[226294]: 2026-02-02 09:58:03.607 226298 DEBUG nova.compute.resource_tracker [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.170 226298 INFO nova.scheduler.client.report [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [req-89c039ce-83b6-44de-aa8c-a5bb0eb4d150] Created resource provider record via placement API for resource provider with UUID 8e32c057-ad28-4c19-8374-763e0c1c8622 and name compute-1.ctlplane.example.com.
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.246 226298 DEBUG oslo_concurrency.processutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 09:58:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:58:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:04.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:58:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:04.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:04 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 09:58:04 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1974295898' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.755 226298 DEBUG oslo_concurrency.processutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.760 226298 DEBUG nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 02 09:58:04 compute-1 nova_compute[226294]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.760 226298 INFO nova.virt.libvirt.host [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] kernel doesn't support AMD SEV
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.761 226298 DEBUG nova.compute.provider_tree [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.762 226298 DEBUG nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 02 09:58:04 compute-1 ceph-mon[80115]: pgmap v552: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:58:04 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3775011156' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.931 226298 DEBUG nova.scheduler.client.report [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Updated inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.932 226298 DEBUG nova.compute.provider_tree [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Updating resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 02 09:58:04 compute-1 nova_compute[226294]: 2026-02-02 09:58:04.932 226298 DEBUG nova.compute.provider_tree [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 09:58:05 compute-1 nova_compute[226294]: 2026-02-02 09:58:05.055 226298 DEBUG nova.compute.provider_tree [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Updating resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 02 09:58:05 compute-1 nova_compute[226294]: 2026-02-02 09:58:05.135 226298 DEBUG nova.compute.resource_tracker [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 09:58:05 compute-1 nova_compute[226294]: 2026-02-02 09:58:05.136 226298 DEBUG oslo_concurrency.lockutils [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:58:05 compute-1 nova_compute[226294]: 2026-02-02 09:58:05.136 226298 DEBUG nova.service [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 02 09:58:05 compute-1 nova_compute[226294]: 2026-02-02 09:58:05.399 226298 DEBUG nova.service [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 02 09:58:05 compute-1 nova_compute[226294]: 2026-02-02 09:58:05.400 226298 DEBUG nova.servicegroup.drivers.db [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 02 09:58:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095805 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:58:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/685825359' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:58:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1974295898' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:58:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:06.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:06.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:06 compute-1 ceph-mon[80115]: pgmap v553: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:58:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:08.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:08.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:09 compute-1 ceph-mon[80115]: pgmap v554: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:58:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:58:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:10.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:58:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:10.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:11 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 9.
Feb 02 09:58:11 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:58:11 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.206s CPU time.
Feb 02 09:58:11 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:58:11 compute-1 podman[226763]: 2026-02-02 09:58:11.334084348 +0000 UTC m=+0.063606005 container create 123b155399b379be1f4fa4925cb331f5cbd111e1b9d64915a39b1dd9646a2993 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True)
Feb 02 09:58:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db349f9cb7ea7145c448aa73f1739c4f5f00f38f55f443501a2048cc19eefd46/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:58:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db349f9cb7ea7145c448aa73f1739c4f5f00f38f55f443501a2048cc19eefd46/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:58:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db349f9cb7ea7145c448aa73f1739c4f5f00f38f55f443501a2048cc19eefd46/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:58:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db349f9cb7ea7145c448aa73f1739c4f5f00f38f55f443501a2048cc19eefd46/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:58:11 compute-1 podman[226763]: 2026-02-02 09:58:11.395702332 +0000 UTC m=+0.125223949 container init 123b155399b379be1f4fa4925cb331f5cbd111e1b9d64915a39b1dd9646a2993 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Feb 02 09:58:11 compute-1 podman[226763]: 2026-02-02 09:58:11.30596584 +0000 UTC m=+0.035487547 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:58:11 compute-1 podman[226763]: 2026-02-02 09:58:11.40033213 +0000 UTC m=+0.129853747 container start 123b155399b379be1f4fa4925cb331f5cbd111e1b9d64915a39b1dd9646a2993 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 09:58:11 compute-1 bash[226763]: 123b155399b379be1f4fa4925cb331f5cbd111e1b9d64915a39b1dd9646a2993
Feb 02 09:58:11 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:58:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:58:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:58:11 compute-1 ceph-mon[80115]: pgmap v555: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:58:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:58:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:58:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:58:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:58:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:58:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:58:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:12.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:12.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:13 compute-1 podman[226822]: 2026-02-02 09:58:13.43306648 +0000 UTC m=+0.100371184 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 02 09:58:13 compute-1 ceph-mon[80115]: pgmap v556: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:58:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:14.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:58:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:14.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:58:15 compute-1 ceph-mon[80115]: pgmap v557: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:58:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:58:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:16.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:58:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:16.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:17 compute-1 sudo[226850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:58:17 compute-1 sudo[226850]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:58:17 compute-1 sudo[226850]: pam_unix(sudo:session): session closed for user root
Feb 02 09:58:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:17 compute-1 ceph-mon[80115]: pgmap v558: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:58:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:58:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:17 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:58:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:17 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:58:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:18.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:18.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:19 compute-1 ceph-mon[80115]: pgmap v559: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:58:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:20.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:20.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:20 compute-1 podman[226876]: 2026-02-02 09:58:20.396139739 +0000 UTC m=+0.064082998 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 09:58:21 compute-1 ceph-mon[80115]: pgmap v560: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:58:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:22.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:22.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:58:23 compute-1 ceph-mon[80115]: pgmap v561: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfd0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:23 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfc8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:24.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:24.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:25 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:25 compute-1 ceph-mon[80115]: pgmap v562: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:58:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095825 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:58:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:25 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:25 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:26.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:26.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:27 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfc8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:27 compute-1 ceph-mon[80115]: pgmap v563: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:58:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:27 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:27 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:28.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:28.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:28 compute-1 ceph-mon[80115]: pgmap v564: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 09:58:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:29 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 02 09:58:29 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4244421351' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 09:58:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 02 09:58:29 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4244421351' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 09:58:29 compute-1 sudo[226917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:58:29 compute-1 sudo[226917]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:58:29 compute-1 sudo[226917]: pam_unix(sudo:session): session closed for user root
Feb 02 09:58:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/1988233533' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 09:58:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/1988233533' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 09:58:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/4244421351' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 09:58:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/4244421351' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 09:58:29 compute-1 sudo[226942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:58:29 compute-1 sudo[226942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:58:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:29 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfc8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:29 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 02 09:58:29 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/759187924' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 09:58:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 02 09:58:29 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/759187924' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 09:58:30 compute-1 sudo[226942]: pam_unix(sudo:session): session closed for user root
Feb 02 09:58:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:58:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:30.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:58:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:58:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:30.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:58:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/759187924' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 09:58:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/759187924' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 09:58:30 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:58:30 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:58:30 compute-1 ceph-mon[80115]: pgmap v565: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:58:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:31 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:58:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:58:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:58:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:58:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:58:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:58:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:58:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:31 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:31 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfc8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:32.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:32.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:58:32 compute-1 ceph-mon[80115]: pgmap v566: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:58:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:33 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:33 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:33 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:34.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:34.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:35 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:35 compute-1 ceph-mon[80115]: pgmap v567: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Feb 02 09:58:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:35 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:35 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:36 compute-1 sudo[227001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:58:36 compute-1 sudo[227001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:58:36 compute-1 sudo[227001]: pam_unix(sudo:session): session closed for user root
Feb 02 09:58:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:58:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:36.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:58:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:36.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:36 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:58:36 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:58:36 compute-1 ceph-mon[80115]: pgmap v568: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:58:37 compute-1 sudo[227027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:58:37 compute-1 sudo[227027]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:58:37 compute-1 sudo[227027]: pam_unix(sudo:session): session closed for user root
Feb 02 09:58:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:37 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:37 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:37 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:38.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:58:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:38.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:58:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:39 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:39 compute-1 ceph-mon[80115]: pgmap v569: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:58:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:39 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:39 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:40.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:40 compute-1 nova_compute[226294]: 2026-02-02 09:58:40.402 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:58:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:40.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:40 compute-1 nova_compute[226294]: 2026-02-02 09:58:40.604 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:58:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:41 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:41 compute-1 ceph-mon[80115]: pgmap v570: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:58:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:41 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:41 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:58:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:42.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:58:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:58:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:42.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:58:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:43 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:43 compute-1 ceph-mon[80115]: pgmap v571: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:58:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:43 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:43 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:44.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:44.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:44 compute-1 podman[227055]: 2026-02-02 09:58:44.449226464 +0000 UTC m=+0.120461768 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3)
Feb 02 09:58:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:58:44.898 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:58:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:58:44.899 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:58:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:58:44.899 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:58:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:45 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:45 compute-1 ceph-mon[80115]: pgmap v572: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:58:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:45 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:45 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:46.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:58:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:46.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:58:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:47 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:47 compute-1 ceph-mon[80115]: pgmap v573: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:58:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:58:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:47 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:47 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:48.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:48.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:49 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:49 compute-1 ceph-mon[80115]: pgmap v574: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:58:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:49 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:49 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:50.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000025s ======
Feb 02 09:58:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:50.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Feb 02 09:58:51 compute-1 podman[227082]: 2026-02-02 09:58:51.376200825 +0000 UTC m=+0.056839232 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 02 09:58:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:51 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:51 compute-1 ceph-mon[80115]: pgmap v575: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:58:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:51 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:51 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:52.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:52.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:53 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:53 compute-1 ceph-mon[80115]: pgmap v576: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:58:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:53 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:53 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:54.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:58:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:54.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:58:54 compute-1 ceph-mon[80115]: pgmap v577: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:58:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:55 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:55 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:55 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfd0002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:56.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:58:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:56.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:58:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095856 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:58:57 compute-1 sudo[227106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:58:57 compute-1 sudo[227106]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:58:57 compute-1 sudo[227106]: pam_unix(sudo:session): session closed for user root
Feb 02 09:58:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:58:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:57 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:57 compute-1 ceph-mon[80115]: pgmap v578: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:58:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:57 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:57 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:58:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:58:58.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:58:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:58:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:58:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:58:58.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:58:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:59 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfd0002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:59 compute-1 ceph-mon[80115]: pgmap v579: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.497387) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026339497435, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1077, "num_deletes": 251, "total_data_size": 2553694, "memory_usage": 2588048, "flush_reason": "Manual Compaction"}
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026339513326, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1669056, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19745, "largest_seqno": 20817, "table_properties": {"data_size": 1664205, "index_size": 2439, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10544, "raw_average_key_size": 19, "raw_value_size": 1654479, "raw_average_value_size": 3086, "num_data_blocks": 109, "num_entries": 536, "num_filter_entries": 536, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026253, "oldest_key_time": 1770026253, "file_creation_time": 1770026339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 15993 microseconds, and 4716 cpu microseconds.
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.513381) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1669056 bytes OK
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.513403) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.515485) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.515511) EVENT_LOG_v1 {"time_micros": 1770026339515504, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.515532) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 2548388, prev total WAL file size 2548388, number of live WAL files 2.
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.516661) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1629KB)], [36(12MB)]
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026339516740, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 14903754, "oldest_snapshot_seqno": -1}
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 5000 keys, 12722323 bytes, temperature: kUnknown
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026339624291, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 12722323, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12688118, "index_size": 20591, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 127606, "raw_average_key_size": 25, "raw_value_size": 12596473, "raw_average_value_size": 2519, "num_data_blocks": 844, "num_entries": 5000, "num_filter_entries": 5000, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.624675) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 12722323 bytes
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.627634) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 138.4 rd, 118.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.6 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(16.6) write-amplify(7.6) OK, records in: 5516, records dropped: 516 output_compression: NoCompression
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.627662) EVENT_LOG_v1 {"time_micros": 1770026339627649, "job": 20, "event": "compaction_finished", "compaction_time_micros": 107657, "compaction_time_cpu_micros": 36449, "output_level": 6, "num_output_files": 1, "total_output_size": 12722323, "num_input_records": 5516, "num_output_records": 5000, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026339628036, "job": 20, "event": "table_file_deletion", "file_number": 38}
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026339630302, "job": 20, "event": "table_file_deletion", "file_number": 36}
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.516420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.630544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.630554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.630558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.630561) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:58:59 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-09:58:59.630566) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 09:58:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:59 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:58:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:58:59 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:00.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:00.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:01 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:01 compute-1 ceph-mon[80115]: pgmap v580: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:59:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:01 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfd0002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.652 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.653 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.653 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.653 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.679 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.679 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.680 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.680 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.681 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.681 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.681 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.682 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.683 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 09:59:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:01 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.732 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.732 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.732 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.733 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 09:59:01 compute-1 nova_compute[226294]: 2026-02-02 09:59:01.734 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 09:59:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 09:59:02 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3758363331' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:59:02 compute-1 nova_compute[226294]: 2026-02-02 09:59:02.185 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 09:59:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:02.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:02 compute-1 nova_compute[226294]: 2026-02-02 09:59:02.371 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 09:59:02 compute-1 nova_compute[226294]: 2026-02-02 09:59:02.372 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5255MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 09:59:02 compute-1 nova_compute[226294]: 2026-02-02 09:59:02.372 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:59:02 compute-1 nova_compute[226294]: 2026-02-02 09:59:02.372 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:59:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:59:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:02.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:59:02 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1382096754' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:59:02 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3758363331' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:59:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:59:02 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/451321705' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:59:02 compute-1 nova_compute[226294]: 2026-02-02 09:59:02.623 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 09:59:02 compute-1 nova_compute[226294]: 2026-02-02 09:59:02.624 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 09:59:02 compute-1 nova_compute[226294]: 2026-02-02 09:59:02.810 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 09:59:03 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 09:59:03 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/944610693' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:59:03 compute-1 nova_compute[226294]: 2026-02-02 09:59:03.278 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 09:59:03 compute-1 nova_compute[226294]: 2026-02-02 09:59:03.285 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 09:59:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:03 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:03 compute-1 ceph-mon[80115]: pgmap v581: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:59:03 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4057521887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:59:03 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/944610693' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:59:03 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3932492605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 09:59:03 compute-1 nova_compute[226294]: 2026-02-02 09:59:03.575 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 09:59:03 compute-1 nova_compute[226294]: 2026-02-02 09:59:03.578 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 09:59:03 compute-1 nova_compute[226294]: 2026-02-02 09:59:03.578 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:59:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:03 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfa8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:03 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfd0002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:04.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:04.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:05 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:05 compute-1 ceph-mon[80115]: pgmap v582: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:59:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:05 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:05 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfc8001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:06.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:06 : epoch 69807533 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:59:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:06.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:07 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfd00091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:07 compute-1 ceph-mon[80115]: pgmap v583: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 09:59:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:07 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:07 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:08.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:59:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:08.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:59:08 compute-1 ceph-mon[80115]: pgmap v584: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Feb 02 09:59:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 09:59:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 8815 writes, 34K keys, 8815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 8815 writes, 1876 syncs, 4.70 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 746 writes, 1209 keys, 746 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
                                           Interval WAL: 746 writes, 348 syncs, 2.14 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Feb 02 09:59:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:09 : epoch 69807533 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:59:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:09 : epoch 69807533 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:59:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:09 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfc8001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:09 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfd00091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:09 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb4004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:10.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:10.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:10 compute-1 ceph-mon[80115]: pgmap v585: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:59:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfb8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[226779]: 02/02/2026 09:59:11 : epoch 69807533 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fdfc80022c0 fd 38 proxy ignored for local
Feb 02 09:59:11 compute-1 kernel: ganesha.nfsd[227179]: segfault at 50 ip 00007fe05a59332e sp 00007fe0057f9210 error 4 in libntirpc.so.5.8[7fe05a578000+2c000] likely on CPU 1 (core 0, socket 1)
Feb 02 09:59:11 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 09:59:11 compute-1 systemd[1]: Started Process Core Dump (PID 227183/UID 0).
Feb 02 09:59:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:12.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:12.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:12 compute-1 systemd-coredump[227184]: Process 226783 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 57:
                                                    #0  0x00007fe05a59332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 09:59:12 compute-1 systemd[1]: systemd-coredump@9-227183-0.service: Deactivated successfully.
Feb 02 09:59:12 compute-1 podman[227189]: 2026-02-02 09:59:12.60093131 +0000 UTC m=+0.036728874 container died 123b155399b379be1f4fa4925cb331f5cbd111e1b9d64915a39b1dd9646a2993 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Feb 02 09:59:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-db349f9cb7ea7145c448aa73f1739c4f5f00f38f55f443501a2048cc19eefd46-merged.mount: Deactivated successfully.
Feb 02 09:59:12 compute-1 podman[227189]: 2026-02-02 09:59:12.676594865 +0000 UTC m=+0.112392359 container remove 123b155399b379be1f4fa4925cb331f5cbd111e1b9d64915a39b1dd9646a2993 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True)
Feb 02 09:59:12 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 09:59:12 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 09:59:12 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.261s CPU time.
Feb 02 09:59:13 compute-1 ceph-mon[80115]: pgmap v586: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:59:13 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Feb 02 09:59:13 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4212740508' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Feb 02 09:59:13 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Feb 02 09:59:13 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3127826587' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Feb 02 09:59:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:59:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:14.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:59:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/4212740508' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Feb 02 09:59:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3127826587' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Feb 02 09:59:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:14.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:15 compute-1 podman[227234]: 2026-02-02 09:59:15.433615718 +0000 UTC m=+0.109297418 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 02 09:59:15 compute-1 ceph-mon[80115]: from='client.24791 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Feb 02 09:59:15 compute-1 ceph-mon[80115]: from='client.24797 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Feb 02 09:59:15 compute-1 ceph-mon[80115]: from='client.24797 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Feb 02 09:59:15 compute-1 ceph-mon[80115]: pgmap v587: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:59:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:59:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:16.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:59:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:16.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:17 compute-1 sudo[227261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:59:17 compute-1 sudo[227261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:59:17 compute-1 sudo[227261]: pam_unix(sudo:session): session closed for user root
Feb 02 09:59:17 compute-1 ceph-mon[80115]: pgmap v588: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:59:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:59:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095917 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 09:59:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:18.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:59:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:18.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:59:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095918 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:59:19 compute-1 ceph-mon[80115]: pgmap v589: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:59:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:20.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:20.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:20 compute-1 ceph-mon[80115]: pgmap v590: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:59:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:22.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:22 compute-1 podman[227288]: 2026-02-02 09:59:22.429112458 +0000 UTC m=+0.096841871 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 02 09:59:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:22.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:23 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 10.
Feb 02 09:59:23 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:59:23 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.261s CPU time.
Feb 02 09:59:23 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 09:59:23 compute-1 podman[227357]: 2026-02-02 09:59:23.310282004 +0000 UTC m=+0.059861041 container create 5028719f360da6fe64ee691f5a029f1fd7268518239c7de7973349ae5c53e866 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 09:59:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3200e5647ef6db81293f92e95b7a46d0b6c2b35334c914309ecd465e7001205a/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 09:59:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3200e5647ef6db81293f92e95b7a46d0b6c2b35334c914309ecd465e7001205a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 09:59:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3200e5647ef6db81293f92e95b7a46d0b6c2b35334c914309ecd465e7001205a/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:59:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3200e5647ef6db81293f92e95b7a46d0b6c2b35334c914309ecd465e7001205a/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 09:59:23 compute-1 podman[227357]: 2026-02-02 09:59:23.37227777 +0000 UTC m=+0.121856867 container init 5028719f360da6fe64ee691f5a029f1fd7268518239c7de7973349ae5c53e866 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Feb 02 09:59:23 compute-1 podman[227357]: 2026-02-02 09:59:23.283705777 +0000 UTC m=+0.033284864 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 09:59:23 compute-1 podman[227357]: 2026-02-02 09:59:23.378628547 +0000 UTC m=+0.128207574 container start 5028719f360da6fe64ee691f5a029f1fd7268518239c7de7973349ae5c53e866 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Feb 02 09:59:23 compute-1 bash[227357]: 5028719f360da6fe64ee691f5a029f1fd7268518239c7de7973349ae5c53e866
Feb 02 09:59:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:23 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 09:59:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:23 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 09:59:23 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 09:59:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:23 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 09:59:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:23 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 09:59:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:23 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 09:59:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:23 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 09:59:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:23 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 09:59:23 compute-1 ceph-mon[80115]: pgmap v591: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:59:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:23 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 09:59:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:24.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:24.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:25 compute-1 ceph-mon[80115]: pgmap v592: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:59:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:26.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:26.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:27 compute-1 ceph-mon[80115]: pgmap v593: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:59:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:28.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:28.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:29 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 09:59:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:29 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 09:59:29 compute-1 ceph-mon[80115]: pgmap v594: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:59:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:30.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:30.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:31 compute-1 ceph-mon[80115]: pgmap v595: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:59:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:32.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:32.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:59:33 compute-1 ceph-mon[80115]: pgmap v596: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 09:59:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:59:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:34.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:59:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:34.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 09:59:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3826 writes, 21K keys, 3826 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s
                                           Cumulative WAL: 3826 writes, 3826 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1415 writes, 6875 keys, 1415 commit groups, 1.0 writes per commit group, ingest: 16.41 MB, 0.03 MB/s
                                           Interval WAL: 1415 writes, 1415 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    114.9      0.29              0.07        10    0.029       0      0       0.0       0.0
                                             L6      1/0   12.13 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.4    120.2    101.5      1.11              0.25         9    0.123     43K   4830       0.0       0.0
                                            Sum      1/0   12.13 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.4     95.5    104.3      1.39              0.33        19    0.073     43K   4830       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.3    124.4    125.4      0.50              0.17         8    0.062     22K   2565       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    120.2    101.5      1.11              0.25         9    0.123     43K   4830       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    115.6      0.28              0.07         9    0.032       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.032, interval 0.012
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.14 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 1.4 seconds
                                           Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a64debd350#2 capacity: 304.00 MB usage: 8.81 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000112 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(474,8.44 MB,2.77615%) FilterBlock(19,131.30 KB,0.0421775%) IndexBlock(19,244.95 KB,0.0786882%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Feb 02 09:59:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Feb 02 09:59:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1254072482' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 09:59:35 compute-1 ceph-mon[80115]: pgmap v597: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:59:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/1254072482' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Feb 02 09:59:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/540781071' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7d0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:35 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:36 compute-1 sudo[227436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:59:36 compute-1 sudo[227436]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:59:36 compute-1 sudo[227436]: pam_unix(sudo:session): session closed for user root
Feb 02 09:59:36 compute-1 sudo[227461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Feb 02 09:59:36 compute-1 sudo[227461]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:59:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:59:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:36.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:59:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:59:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:36.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:59:36 compute-1 ceph-mon[80115]: from='client.24818 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Feb 02 09:59:36 compute-1 ceph-mon[80115]: from='client.24832 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Feb 02 09:59:36 compute-1 ceph-mon[80115]: from='client.24832 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Feb 02 09:59:36 compute-1 sudo[227461]: pam_unix(sudo:session): session closed for user root
Feb 02 09:59:36 compute-1 sudo[227507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 09:59:36 compute-1 sudo[227507]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:59:36 compute-1 sudo[227507]: pam_unix(sudo:session): session closed for user root
Feb 02 09:59:36 compute-1 sudo[227533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 09:59:36 compute-1 sudo[227533]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:59:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:37 compute-1 sudo[227573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:59:37 compute-1 sudo[227573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:59:37 compute-1 sudo[227573]: pam_unix(sudo:session): session closed for user root
Feb 02 09:59:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:37 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:37 compute-1 sudo[227533]: pam_unix(sudo:session): session closed for user root
Feb 02 09:59:37 compute-1 ceph-mon[80115]: pgmap v598: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:59:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:59:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:59:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:59:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:59:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/095937 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 09:59:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:37 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:37 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:59:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:38.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:59:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:38.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:59:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 09:59:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:59:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:59:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 09:59:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 09:59:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 09:59:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:39 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:39 compute-1 ceph-mon[80115]: pgmap v599: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 09:59:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:39 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:39 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:59:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:40.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:59:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:40.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:40 compute-1 ceph-mon[80115]: pgmap v600: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:59:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:41 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:41 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:41 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:42.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:42 compute-1 sudo[227616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 09:59:42 compute-1 sudo[227616]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:59:42 compute-1 sudo[227616]: pam_unix(sudo:session): session closed for user root
Feb 02 09:59:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:42.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:43 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:59:43 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 09:59:43 compute-1 ceph-mon[80115]: pgmap v601: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:59:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:43 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:43 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:43 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:44.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:44.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:59:44.899 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 09:59:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:59:44.900 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 09:59:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 09:59:44.900 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 09:59:45 compute-1 ceph-mon[80115]: pgmap v602: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 09:59:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:45 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:45 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:45 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:46.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:46 compute-1 podman[227643]: 2026-02-02 09:59:46.461769163 +0000 UTC m=+0.126753926 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 02 09:59:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:59:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:46.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:59:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:47 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:47 compute-1 ceph-mon[80115]: pgmap v603: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:59:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 09:59:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:47 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:47 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:48.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:59:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:48.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:59:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:49 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:49 compute-1 ceph-mon[80115]: pgmap v604: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Feb 02 09:59:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:49 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:49 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:50.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:50.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:51 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:51 compute-1 ceph-mon[80115]: pgmap v605: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:59:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:51 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:51 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:52.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:52.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:53 compute-1 podman[227673]: 2026-02-02 09:59:53.391277372 +0000 UTC m=+0.059905562 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 02 09:59:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:53 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:53 compute-1 ceph-mon[80115]: pgmap v606: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:59:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:53 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:53 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:59:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:54.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:59:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:59:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:54.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:59:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:55 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:55 compute-1 ceph-mon[80115]: pgmap v607: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 09:59:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2023615819' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 09:59:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2023615819' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 09:59:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:55 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:55 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:56.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 09:59:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:56.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 09:59:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 09:59:57 compute-1 sudo[227694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 09:59:57 compute-1 sudo[227694]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 09:59:57 compute-1 sudo[227694]: pam_unix(sudo:session): session closed for user root
Feb 02 09:59:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:57 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:57 compute-1 ceph-mon[80115]: pgmap v608: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 09:59:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:57 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:57 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 09:59:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:09:59:58.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 09:59:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 09:59:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 09:59:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:09:59:58.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 09:59:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:59 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:59 compute-1 ceph-mon[80115]: pgmap v609: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 09:59:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:59 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 09:59:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 09:59:59 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:00.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:00.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:00 compute-1 ceph-mon[80115]: overall HEALTH_OK
Feb 02 10:00:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:01 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:01 compute-1 ceph-mon[80115]: pgmap v610: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:01 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:01 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:02.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:02.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:00:02 compute-1 ceph-mon[80115]: pgmap v611: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:03 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.574 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.633 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.633 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.634 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.674 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.674 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.675 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.675 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.676 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:03 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.677 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.678 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.678 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.679 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.710 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.711 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.711 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.711 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:00:03 compute-1 nova_compute[226294]: 2026-02-02 10:00:03.712 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:00:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:03 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:04 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:00:04 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3410704436' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.156 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:00:04 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3224298065' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:00:04 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3254037209' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:00:04 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3410704436' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.319 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.320 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5257MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.321 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.321 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.413 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.413 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:00:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:04.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.437 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:00:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:04.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:04 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:00:04 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1496402642' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.843 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.849 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.867 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.869 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.869 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:00:04 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:00:04 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2770751255' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:00:04 compute-1 nova_compute[226294]: 2026-02-02 10:00:04.939 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:00:05 compute-1 ceph-mon[80115]: pgmap v612: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 10:00:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1496402642' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:00:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1105285779' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:00:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2770751255' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:00:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:05 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:05 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:05 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:06.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:06.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:07 compute-1 ceph-mon[80115]: pgmap v613: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:07 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:07 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:07 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7d0002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:00:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:08.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:00:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:08.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:09 compute-1 ceph-mon[80115]: pgmap v614: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 10:00:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:09 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:09 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b0000d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:09 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:10.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:10.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:11 compute-1 ceph-mon[80115]: pgmap v615: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:11 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7d0002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:11 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7c8002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:11 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b00018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:12.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:12 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:00:12.537 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:00:12 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:00:12.538 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:00:12 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:00:12.539 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:00:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:12.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:13 compute-1 ceph-mon[80115]: pgmap v616: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:13 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7b4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227372]: 02/02/2026 10:00:13 : epoch 6980757b : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff7d0002010 fd 38 proxy ignored for local
Feb 02 10:00:13 compute-1 kernel: ganesha.nfsd[227769]: segfault at 50 ip 00007ff85bdba32e sp 00007ff7e1ffa210 error 4 in libntirpc.so.5.8[7ff85bd9f000+2c000] likely on CPU 3 (core 0, socket 3)
Feb 02 10:00:13 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 10:00:13 compute-1 systemd[1]: Started Process Core Dump (PID 227773/UID 0).
Feb 02 10:00:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:14.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:14 compute-1 systemd-coredump[227774]: Process 227376 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 57:
                                                    #0  0x00007ff85bdba32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 10:00:14 compute-1 systemd[1]: systemd-coredump@10-227773-0.service: Deactivated successfully.
Feb 02 10:00:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:14.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:14 compute-1 podman[227779]: 2026-02-02 10:00:14.561889277 +0000 UTC m=+0.034875065 container died 5028719f360da6fe64ee691f5a029f1fd7268518239c7de7973349ae5c53e866 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 10:00:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-3200e5647ef6db81293f92e95b7a46d0b6c2b35334c914309ecd465e7001205a-merged.mount: Deactivated successfully.
Feb 02 10:00:14 compute-1 podman[227779]: 2026-02-02 10:00:14.601579879 +0000 UTC m=+0.074565667 container remove 5028719f360da6fe64ee691f5a029f1fd7268518239c7de7973349ae5c53e866 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True)
Feb 02 10:00:14 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 10:00:14 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 10:00:14 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.188s CPU time.
Feb 02 10:00:15 compute-1 ceph-mon[80115]: pgmap v617: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 10:00:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:16.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:16.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:17 compute-1 podman[227827]: 2026-02-02 10:00:17.437301538 +0000 UTC m=+0.105840287 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 02 10:00:17 compute-1 ceph-mon[80115]: pgmap v618: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:00:17 compute-1 sudo[227853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:00:17 compute-1 sudo[227853]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:00:17 compute-1 sudo[227853]: pam_unix(sudo:session): session closed for user root
Feb 02 10:00:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:18.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:18.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:19 compute-1 ceph-mon[80115]: pgmap v619: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100019 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:00:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:20.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:20.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:21 compute-1 ceph-mon[80115]: pgmap v620: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:00:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:22.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:00:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:22.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:00:23 compute-1 ceph-mon[80115]: pgmap v621: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:00:24 compute-1 podman[227882]: 2026-02-02 10:00:24.399803485 +0000 UTC m=+0.072912403 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 02 10:00:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:24.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:24.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:24 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 11.
Feb 02 10:00:24 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 10:00:24 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.188s CPU time.
Feb 02 10:00:24 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 10:00:25 compute-1 podman[227949]: 2026-02-02 10:00:25.037537645 +0000 UTC m=+0.054943863 container create b8cd7e6a5e5ee96a5449f4186f673db41acd163f7cefcb47536d3de0a6b28483 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 10:00:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9465b7c38f1b6c44bfc60dadfc1f392a36c5062ddcaf7b9a22d0f3992451af5/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 10:00:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9465b7c38f1b6c44bfc60dadfc1f392a36c5062ddcaf7b9a22d0f3992451af5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 10:00:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9465b7c38f1b6c44bfc60dadfc1f392a36c5062ddcaf7b9a22d0f3992451af5/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 10:00:25 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9465b7c38f1b6c44bfc60dadfc1f392a36c5062ddcaf7b9a22d0f3992451af5/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 10:00:25 compute-1 podman[227949]: 2026-02-02 10:00:25.01448701 +0000 UTC m=+0.031893208 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 10:00:25 compute-1 podman[227949]: 2026-02-02 10:00:25.116047724 +0000 UTC m=+0.133454002 container init b8cd7e6a5e5ee96a5449f4186f673db41acd163f7cefcb47536d3de0a6b28483 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Feb 02 10:00:25 compute-1 podman[227949]: 2026-02-02 10:00:25.131055988 +0000 UTC m=+0.148462216 container start b8cd7e6a5e5ee96a5449f4186f673db41acd163f7cefcb47536d3de0a6b28483 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Feb 02 10:00:25 compute-1 bash[227949]: b8cd7e6a5e5ee96a5449f4186f673db41acd163f7cefcb47536d3de0a6b28483
Feb 02 10:00:25 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 10:00:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 10:00:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 10:00:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 10:00:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 10:00:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 10:00:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 10:00:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 10:00:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 10:00:25 compute-1 ceph-mon[80115]: pgmap v622: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 10:00:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:26.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:26.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:27 compute-1 ceph-mon[80115]: pgmap v623: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:00:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:28.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:28.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:28 compute-1 ceph-mon[80115]: pgmap v624: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 101 KiB/s rd, 596 B/s wr, 167 op/s
Feb 02 10:00:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:30.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:00:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:30.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:00:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:31 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 10:00:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:31 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:00:31 compute-1 ceph-mon[80115]: pgmap v625: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 101 KiB/s rd, 596 B/s wr, 167 op/s
Feb 02 10:00:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:32.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:00:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:32.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:32 compute-1 sshd-session[227880]: error: kex_exchange_identification: read: Connection timed out
Feb 02 10:00:32 compute-1 sshd-session[227880]: banner exchange: Connection from 223.71.254.78 port 55632: Connection timed out
Feb 02 10:00:33 compute-1 ceph-mon[80115]: pgmap v626: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 101 KiB/s rd, 596 B/s wr, 167 op/s
Feb 02 10:00:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:00:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:34.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:00:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:34.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:35 compute-1 ceph-mon[80115]: pgmap v627: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 102 KiB/s rd, 938 B/s wr, 168 op/s
Feb 02 10:00:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:36.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:36.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c54000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:37 compute-1 ceph-mon[80115]: pgmap v628: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 101 KiB/s rd, 938 B/s wr, 168 op/s
Feb 02 10:00:37 compute-1 ceph-osd[77691]: bluestore.MempoolThread fragmentation_score=0.000029 took=0.000037s
Feb 02 10:00:37 compute-1 sudo[228028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:00:37 compute-1 sudo[228028]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:00:37 compute-1 sudo[228028]: pam_unix(sudo:session): session closed for user root
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:38.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:38.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:39 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:39 compute-1 ceph-mon[80115]: pgmap v629: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 102 KiB/s rd, 1023 B/s wr, 169 op/s
Feb 02 10:00:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100039 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:00:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:39 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:39 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:40.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:40.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:40 compute-1 ceph-mon[80115]: pgmap v630: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 10:00:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:41 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:41 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c300016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:41 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:42.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:42.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:42 compute-1 sudo[228055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:00:42 compute-1 sudo[228055]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:00:42 compute-1 sudo[228055]: pam_unix(sudo:session): session closed for user root
Feb 02 10:00:42 compute-1 sudo[228081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:00:42 compute-1 sudo[228081]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:00:43 compute-1 sudo[228081]: pam_unix(sudo:session): session closed for user root
Feb 02 10:00:43 compute-1 ceph-mon[80115]: pgmap v631: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 10:00:43 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:00:43 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:00:43 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:00:43 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:00:43 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:00:43 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:00:43 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:00:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:43 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:43 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:43 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c300016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:44.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:44.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:00:44.900 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:00:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:00:44.901 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:00:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:00:44.902 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:00:45 compute-1 ceph-mon[80115]: pgmap v632: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Feb 02 10:00:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:45 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:45 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:45 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:46.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:46.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:47 compute-1 ceph-mon[80115]: pgmap v633: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Feb 02 10:00:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:00:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:47 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c300016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:47 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:47 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:48 compute-1 podman[228139]: 2026-02-02 10:00:48.431687297 +0000 UTC m=+0.101116844 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 02 10:00:48 compute-1 sudo[228160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:00:48 compute-1 sudo[228160]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:00:48 compute-1 sudo[228160]: pam_unix(sudo:session): session closed for user root
Feb 02 10:00:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:48.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:48.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:00:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:00:49 compute-1 ceph-mon[80115]: pgmap v634: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Feb 02 10:00:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:49 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:49 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:49 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:50.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:50.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:51 compute-1 ceph-mon[80115]: pgmap v635: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:51 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:51 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:51 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:52.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:52.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:53 compute-1 ceph-mon[80115]: pgmap v636: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:53 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:53 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:53 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:54.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:54.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:55 compute-1 podman[228197]: 2026-02-02 10:00:55.38072249 +0000 UTC m=+0.053462324 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 02 10:00:55 compute-1 ceph-mon[80115]: pgmap v637: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3282188228' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:00:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3282188228' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:00:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:55 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:55 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:55 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:00:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:56.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:00:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:56.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:00:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:57 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:57 compute-1 ceph-mon[80115]: pgmap v638: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:00:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:57 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c34000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:57 compute-1 sudo[228217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:00:57 compute-1 sudo[228217]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:00:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:57 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:57 compute-1 sudo[228217]: pam_unix(sudo:session): session closed for user root
Feb 02 10:00:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:00:58.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:00:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:00:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:00:58.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:00:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:59 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:59 compute-1 ceph-mon[80115]: pgmap v639: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:00:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:59 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:00:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:00:59 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:00.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:00.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:01 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:01 compute-1 ceph-mon[80115]: pgmap v640: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:01:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:01 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:01 compute-1 CROND[228245]: (root) CMD (run-parts /etc/cron.hourly)
Feb 02 10:01:01 compute-1 run-parts[228248]: (/etc/cron.hourly) starting 0anacron
Feb 02 10:01:01 compute-1 run-parts[228254]: (/etc/cron.hourly) finished 0anacron
Feb 02 10:01:01 compute-1 CROND[228244]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 02 10:01:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:01 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:02.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:02.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:01:02 compute-1 ceph-mon[80115]: pgmap v641: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:01:02 compute-1 nova_compute[226294]: 2026-02-02 10:01:02.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:01:02 compute-1 nova_compute[226294]: 2026-02-02 10:01:02.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:01:02 compute-1 nova_compute[226294]: 2026-02-02 10:01:02.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:01:02 compute-1 nova_compute[226294]: 2026-02-02 10:01:02.943 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:01:02 compute-1 nova_compute[226294]: 2026-02-02 10:01:02.944 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:01:02 compute-1 nova_compute[226294]: 2026-02-02 10:01:02.945 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:01:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:03 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:03 compute-1 nova_compute[226294]: 2026-02-02 10:01:03.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:01:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:03 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:03 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:04 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2265931273' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:01:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:04.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:04.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.679 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.680 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.680 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.681 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:01:04 compute-1 nova_compute[226294]: 2026-02-02 10:01:04.682 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:01:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:01:05 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1987960518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.132 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:01:05 compute-1 ceph-mon[80115]: pgmap v642: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:01:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1167134700' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:01:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1987960518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.317 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.319 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5235MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.319 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.319 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.391 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.391 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.409 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:01:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:05 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c34001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:05 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:05 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:05 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:01:05 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/791708781' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.865 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.871 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.889 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.892 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:01:05 compute-1 nova_compute[226294]: 2026-02-02 10:01:05.892 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:01:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3936621474' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:01:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/791708781' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:01:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/667549826' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:01:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:06.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:06.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:06 compute-1 nova_compute[226294]: 2026-02-02 10:01:06.888 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:01:07 compute-1 ceph-mon[80115]: pgmap v643: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:01:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:07 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:07 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:07 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:08.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:01:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:08.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:01:09 compute-1 ceph-mon[80115]: pgmap v644: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:01:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:09 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:09 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:09 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:10.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:10.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:11 compute-1 ceph-mon[80115]: pgmap v645: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:01:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:11 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:11 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:11 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:12.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:12.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:13 compute-1 ceph-mon[80115]: pgmap v646: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:01:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:13 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c280016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:13 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:13 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:14.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:14.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:15 compute-1 ceph-mon[80115]: pgmap v647: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:01:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:15 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:15 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:15 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:16.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:16.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:17 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:17 compute-1 ceph-mon[80115]: pgmap v648: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:01:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:01:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:17 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:17 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:17 compute-1 sudo[228308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:01:17 compute-1 sudo[228308]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:01:17 compute-1 sudo[228308]: pam_unix(sudo:session): session closed for user root
Feb 02 10:01:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:01:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:18.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:01:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:18.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100118 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:01:19 compute-1 podman[228334]: 2026-02-02 10:01:19.416716074 +0000 UTC m=+0.087467897 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true)
Feb 02 10:01:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:19 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:19 compute-1 ceph-mon[80115]: pgmap v649: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 10:01:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:19 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:19 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:20.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:20.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:20 compute-1 ceph-mon[80115]: pgmap v650: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:01:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:21 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:21 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:21 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:01:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:22.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:01:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:22.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:23 compute-1 ceph-mon[80115]: pgmap v651: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:01:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:23 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:23 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:23 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:24.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:24.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:25 compute-1 ceph-mon[80115]: pgmap v652: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:01:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:25 compute-1 sshd-session[228365]: Invalid user solv from 80.94.92.184 port 47216
Feb 02 10:01:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:25 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c280036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:25 compute-1 podman[228367]: 2026-02-02 10:01:25.85850406 +0000 UTC m=+0.094761419 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 02 10:01:25 compute-1 sshd-session[228365]: Connection closed by invalid user solv 80.94.92.184 port 47216 [preauth]
Feb 02 10:01:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:26.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:26.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:26 compute-1 ceph-mon[80115]: pgmap v653: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:01:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:27 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:27 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:27 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:28.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:28 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 10:01:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:28.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:29 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c280036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:29 compute-1 ceph-mon[80115]: pgmap v654: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Feb 02 10:01:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:29 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:29 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:30.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:30.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:31 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:31 compute-1 ceph-mon[80115]: pgmap v655: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Feb 02 10:01:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:31 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 10:01:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:31 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:01:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:31 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:31 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:32.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:01:32 compute-1 ceph-mon[80115]: pgmap v656: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Feb 02 10:01:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:32.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:33 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:33 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:33 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:34.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:34 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 10:01:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:34.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:35 compute-1 ceph-mon[80115]: pgmap v657: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 10:01:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:35 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:35 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:35 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:36.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:36.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:37 compute-1 ceph-mon[80115]: pgmap v658: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Feb 02 10:01:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c3c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:37 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:37 compute-1 sudo[228392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:01:37 compute-1 sudo[228392]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:01:37 compute-1 sudo[228392]: pam_unix(sudo:session): session closed for user root
Feb 02 10:01:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:38.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:38.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - - [02/Feb/2026:10:01:38.846 +0000] "GET /swift/info HTTP/1.1" 200 539 - "python-urllib3/1.26.5" - latency=0.001000026s
Feb 02 10:01:39 compute-1 ceph-mon[80115]: pgmap v659: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Feb 02 10:01:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:39 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:39 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:39 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c28004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:40.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:40.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100140 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:01:41 compute-1 ceph-mon[80115]: pgmap v660: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Feb 02 10:01:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:41 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:41 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:41 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58001f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:42.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:01:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:42.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:01:43 compute-1 ceph-mon[80115]: pgmap v661: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 937 B/s wr, 2 op/s
Feb 02 10:01:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:43 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c34000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:43 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e140 e140: 3 total, 3 up, 3 in
Feb 02 10:01:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:43 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c34000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:43 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:44.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:44 compute-1 ceph-mon[80115]: osdmap e140: 3 total, 3 up, 3 in
Feb 02 10:01:44 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e141 e141: 3 total, 3 up, 3 in
Feb 02 10:01:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:44.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:01:44.901 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:01:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:01:44.902 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:01:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:01:44.902 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:01:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:45 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58001f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:45 compute-1 ceph-mon[80115]: pgmap v663: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 306 B/s rd, 102 B/s wr, 0 op/s
Feb 02 10:01:45 compute-1 ceph-mon[80115]: osdmap e141: 3 total, 3 up, 3 in
Feb 02 10:01:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e142 e142: 3 total, 3 up, 3 in
Feb 02 10:01:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:45 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c34000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:45 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:01:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:46.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:01:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:46.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:46 compute-1 ceph-mon[80115]: osdmap e142: 3 total, 3 up, 3 in
Feb 02 10:01:46 compute-1 ceph-mon[80115]: pgmap v666: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail
Feb 02 10:01:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:47 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:01:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:47 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e143 e143: 3 total, 3 up, 3 in
Feb 02 10:01:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:47 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340025b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:01:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:48.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:01:48 compute-1 sudo[228424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:01:48 compute-1 sudo[228424]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:01:48 compute-1 sudo[228424]: pam_unix(sudo:session): session closed for user root
Feb 02 10:01:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:48.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:48 compute-1 ceph-mon[80115]: osdmap e143: 3 total, 3 up, 3 in
Feb 02 10:01:48 compute-1 ceph-mon[80115]: pgmap v668: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 8.4 MiB/s wr, 79 op/s
Feb 02 10:01:48 compute-1 sudo[228449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:01:48 compute-1 sudo[228449]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:01:49 compute-1 sudo[228449]: pam_unix(sudo:session): session closed for user root
Feb 02 10:01:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:49 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340025b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:49 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:01:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:01:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:01:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:01:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:01:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:01:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:01:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:49 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58002860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:50 compute-1 podman[228507]: 2026-02-02 10:01:50.427759546 +0000 UTC m=+0.103560120 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 02 10:01:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:50.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:50.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:50 compute-1 ceph-mon[80115]: pgmap v669: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 6.8 MiB/s wr, 64 op/s
Feb 02 10:01:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:51 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:51 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340025b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:51 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e144 e144: 3 total, 3 up, 3 in
Feb 02 10:01:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:52.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:52.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:53 compute-1 ceph-mon[80115]: osdmap e144: 3 total, 3 up, 3 in
Feb 02 10:01:53 compute-1 ceph-mon[80115]: pgmap v671: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 6.0 MiB/s wr, 56 op/s
Feb 02 10:01:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:53 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:53 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:53 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340036b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:54 compute-1 sudo[228536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:01:54 compute-1 sudo[228536]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:01:54 compute-1 sudo[228536]: pam_unix(sudo:session): session closed for user root
Feb 02 10:01:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:54.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:54.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:01:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:01:55 compute-1 ceph-mon[80115]: pgmap v672: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 48 op/s
Feb 02 10:01:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2407768571' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:01:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2407768571' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:01:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:55 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58003570 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:55 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:55 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c300041a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:56 compute-1 podman[228562]: 2026-02-02 10:01:56.388792676 +0000 UTC m=+0.058666391 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:01:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:56.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:56.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:01:57 compute-1 ceph-mon[80115]: pgmap v673: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 4.7 MiB/s wr, 44 op/s
Feb 02 10:01:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:57 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340036b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:57 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58003570 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:57 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:58 compute-1 sudo[228583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:01:58 compute-1 sudo[228583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:01:58 compute-1 sudo[228583]: pam_unix(sudo:session): session closed for user root
Feb 02 10:01:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:01:58.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:01:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:01:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:01:58.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:01:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:59 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c300041c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:59 compute-1 ceph-mon[80115]: pgmap v674: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Feb 02 10:01:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:59 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340036b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:01:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:01:59 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58004280 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:00.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:00.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:01 compute-1 anacron[4318]: Job `cron.monthly' started
Feb 02 10:02:01 compute-1 anacron[4318]: Job `cron.monthly' terminated
Feb 02 10:02:01 compute-1 anacron[4318]: Normal exit (3 jobs run)
Feb 02 10:02:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:01 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:01 compute-1 ceph-mon[80115]: pgmap v675: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Feb 02 10:02:01 compute-1 nova_compute[226294]: 2026-02-02 10:02:01.643 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:01 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c300041e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:01 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340036b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:02.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:02:02 compute-1 ceph-mon[80115]: pgmap v676: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 299 B/s rd, 0 op/s
Feb 02 10:02:02 compute-1 nova_compute[226294]: 2026-02-02 10:02:02.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:02.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:03 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58004280 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:03 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:03 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:04.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:04 compute-1 nova_compute[226294]: 2026-02-02 10:02:04.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:04 compute-1 nova_compute[226294]: 2026-02-02 10:02:04.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:02:04 compute-1 nova_compute[226294]: 2026-02-02 10:02:04.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:02:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:04.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:04 compute-1 nova_compute[226294]: 2026-02-02 10:02:04.725 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:02:04 compute-1 nova_compute[226294]: 2026-02-02 10:02:04.725 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:04 compute-1 nova_compute[226294]: 2026-02-02 10:02:04.726 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:04 compute-1 nova_compute[226294]: 2026-02-02 10:02:04.726 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:04 compute-1 nova_compute[226294]: 2026-02-02 10:02:04.727 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:04 compute-1 nova_compute[226294]: 2026-02-02 10:02:04.727 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:02:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:05 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340036b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:05 compute-1 ceph-mon[80115]: pgmap v677: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:02:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/13567556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3976023196' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:05 compute-1 nova_compute[226294]: 2026-02-02 10:02:05.722 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:05 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58004280 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:05 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:02:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:06.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:02:06 compute-1 nova_compute[226294]: 2026-02-02 10:02:06.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:06 compute-1 nova_compute[226294]: 2026-02-02 10:02:06.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:02:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2737948522' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:06 compute-1 ceph-mon[80115]: pgmap v678: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:02:06 compute-1 nova_compute[226294]: 2026-02-02 10:02:06.678 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:02:06 compute-1 nova_compute[226294]: 2026-02-02 10:02:06.679 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:02:06 compute-1 nova_compute[226294]: 2026-02-02 10:02:06.679 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:02:06 compute-1 nova_compute[226294]: 2026-02-02 10:02:06.679 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:02:06 compute-1 nova_compute[226294]: 2026-02-02 10:02:06.679 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:02:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:06.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:02:07 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4238708752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.131 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.312 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.314 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5232MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.315 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.315 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:02:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.378 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.378 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.398 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:02:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:07 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004220 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/4238708752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2031407524' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:07 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340036b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:07 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58004280 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:02:07 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2119970772' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.848 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.854 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.871 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.873 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:02:07 compute-1 nova_compute[226294]: 2026-02-02 10:02:07.874 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:02:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:02:08.249 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:02:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:02:08.251 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:02:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:02:08.252 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:02:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:08.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:08 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2119970772' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:08 compute-1 ceph-mon[80115]: pgmap v679: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:02:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:08.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:09 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:09 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c30004240 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:09 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c340036b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:10.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:10.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:11 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c58004280 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:11 compute-1 ceph-mon[80115]: pgmap v680: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:02:11 compute-1 kernel: ganesha.nfsd[228017]: segfault at 50 ip 00007f1cde4a632e sp 00007f1c53ffe210 error 4 in libntirpc.so.5.8[7f1cde48b000+2c000] likely on CPU 4 (core 0, socket 4)
Feb 02 10:02:11 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 10:02:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[227964]: 02/02/2026 10:02:11 : epoch 698075b9 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1c38004050 fd 38 proxy ignored for local
Feb 02 10:02:11 compute-1 systemd[1]: Started Process Core Dump (PID 228662/UID 0).
Feb 02 10:02:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:12 compute-1 systemd-coredump[228663]: Process 227968 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 45:
                                                    #0  0x00007f1cde4a632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007f1cde4b0900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 10:02:12 compute-1 systemd[1]: systemd-coredump@11-228662-0.service: Deactivated successfully.
Feb 02 10:02:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:12.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:12 compute-1 podman[228668]: 2026-02-02 10:02:12.599746331 +0000 UTC m=+0.030179944 container died b8cd7e6a5e5ee96a5449f4186f673db41acd163f7cefcb47536d3de0a6b28483 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1)
Feb 02 10:02:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-e9465b7c38f1b6c44bfc60dadfc1f392a36c5062ddcaf7b9a22d0f3992451af5-merged.mount: Deactivated successfully.
Feb 02 10:02:12 compute-1 podman[228668]: 2026-02-02 10:02:12.650696008 +0000 UTC m=+0.081129571 container remove b8cd7e6a5e5ee96a5449f4186f673db41acd163f7cefcb47536d3de0a6b28483 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid)
Feb 02 10:02:12 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 10:02:12 compute-1 ceph-mon[80115]: pgmap v681: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:02:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:12.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:12 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 10:02:12 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.186s CPU time.
Feb 02 10:02:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:14.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:14.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:15 compute-1 ceph-mon[80115]: pgmap v682: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:02:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:16.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:16.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:17 compute-1 ceph-mon[80115]: pgmap v683: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:02:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:02:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100217 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:02:18 compute-1 sudo[228713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:02:18 compute-1 sudo[228713]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:02:18 compute-1 sudo[228713]: pam_unix(sudo:session): session closed for user root
Feb 02 10:02:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:18.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:18.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:19 compute-1 ceph-mon[80115]: pgmap v684: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Feb 02 10:02:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:20.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:20.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:21 compute-1 podman[228740]: 2026-02-02 10:02:21.413994197 +0000 UTC m=+0.089567673 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 02 10:02:21 compute-1 ceph-mon[80115]: pgmap v685: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:02:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:22.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:22.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:23 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 12.
Feb 02 10:02:23 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 10:02:23 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.186s CPU time.
Feb 02 10:02:23 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 10:02:23 compute-1 podman[228815]: 2026-02-02 10:02:23.319742264 +0000 UTC m=+0.056008771 container create 85ab4d728a71079bd2a743190419dfcd03951eb9e7ca1f94d0aafff8d6f59769 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Feb 02 10:02:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28beda5ed4c246f2d3f1ed0902033b626bd7ff20b987eeaeecb35981e838cb7c/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 10:02:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28beda5ed4c246f2d3f1ed0902033b626bd7ff20b987eeaeecb35981e838cb7c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 10:02:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28beda5ed4c246f2d3f1ed0902033b626bd7ff20b987eeaeecb35981e838cb7c/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 10:02:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28beda5ed4c246f2d3f1ed0902033b626bd7ff20b987eeaeecb35981e838cb7c/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 10:02:23 compute-1 podman[228815]: 2026-02-02 10:02:23.377297375 +0000 UTC m=+0.113563922 container init 85ab4d728a71079bd2a743190419dfcd03951eb9e7ca1f94d0aafff8d6f59769 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 10:02:23 compute-1 podman[228815]: 2026-02-02 10:02:23.392455903 +0000 UTC m=+0.128722400 container start 85ab4d728a71079bd2a743190419dfcd03951eb9e7ca1f94d0aafff8d6f59769 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325)
Feb 02 10:02:23 compute-1 podman[228815]: 2026-02-02 10:02:23.298807405 +0000 UTC m=+0.035073922 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 10:02:23 compute-1 bash[228815]: 85ab4d728a71079bd2a743190419dfcd03951eb9e7ca1f94d0aafff8d6f59769
Feb 02 10:02:23 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 10:02:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:23 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 10:02:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:23 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 10:02:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:23 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 10:02:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:23 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 10:02:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:23 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 10:02:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:23 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 10:02:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:23 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 10:02:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:23 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 10:02:23 compute-1 ceph-mon[80115]: pgmap v686: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:02:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:24.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:24.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:25 compute-1 ceph-mon[80115]: pgmap v687: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:02:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:26.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:26 compute-1 ceph-mon[80115]: pgmap v688: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:02:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:26.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:27 compute-1 podman[228875]: 2026-02-02 10:02:27.397615072 +0000 UTC m=+0.072524395 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:02:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:02:28 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3560888185' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3560888185' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:02:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:28.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:28.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:29 compute-1 ceph-mon[80115]: pgmap v689: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 10:02:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:29 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 10:02:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:29 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:02:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:30.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:30.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e145 e145: 3 total, 3 up, 3 in
Feb 02 10:02:31 compute-1 ceph-mon[80115]: pgmap v690: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Feb 02 10:02:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e146 e146: 3 total, 3 up, 3 in
Feb 02 10:02:32 compute-1 ceph-mon[80115]: osdmap e145: 3 total, 3 up, 3 in
Feb 02 10:02:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:02:32 compute-1 ceph-mon[80115]: osdmap e146: 3 total, 3 up, 3 in
Feb 02 10:02:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:32.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.633613) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026552633650, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2392, "num_deletes": 251, "total_data_size": 6360090, "memory_usage": 6461312, "flush_reason": "Manual Compaction"}
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026552667380, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4164023, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20822, "largest_seqno": 23209, "table_properties": {"data_size": 4154205, "index_size": 6248, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20135, "raw_average_key_size": 20, "raw_value_size": 4134552, "raw_average_value_size": 4180, "num_data_blocks": 273, "num_entries": 989, "num_filter_entries": 989, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026340, "oldest_key_time": 1770026340, "file_creation_time": 1770026552, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 33863 microseconds, and 10942 cpu microseconds.
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.667471) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4164023 bytes OK
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.667496) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.670121) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.670170) EVENT_LOG_v1 {"time_micros": 1770026552670162, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.670194) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6349455, prev total WAL file size 6349455, number of live WAL files 2.
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.671564) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(4066KB)], [39(12MB)]
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026552671642, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 16886346, "oldest_snapshot_seqno": -1}
Feb 02 10:02:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:32.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5465 keys, 14711233 bytes, temperature: kUnknown
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026552799245, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 14711233, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14672361, "index_size": 24106, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13701, "raw_key_size": 137905, "raw_average_key_size": 25, "raw_value_size": 14571119, "raw_average_value_size": 2666, "num_data_blocks": 994, "num_entries": 5465, "num_filter_entries": 5465, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026552, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.799589) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 14711233 bytes
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.800957) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.2 rd, 115.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.1 +0.0 blob) out(14.0 +0.0 blob), read-write-amplify(7.6) write-amplify(3.5) OK, records in: 5989, records dropped: 524 output_compression: NoCompression
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.800992) EVENT_LOG_v1 {"time_micros": 1770026552800976, "job": 22, "event": "compaction_finished", "compaction_time_micros": 127703, "compaction_time_cpu_micros": 29104, "output_level": 6, "num_output_files": 1, "total_output_size": 14711233, "num_input_records": 5989, "num_output_records": 5465, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026552802120, "job": 22, "event": "table_file_deletion", "file_number": 41}
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026552804920, "job": 22, "event": "table_file_deletion", "file_number": 39}
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.671478) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.804985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.804993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.804996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.804997) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:02:32 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:02:32.804999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:02:33 compute-1 ceph-mon[80115]: pgmap v692: 353 pgs: 353 active+clean; 41 MiB data, 194 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 716 B/s wr, 2 op/s
Feb 02 10:02:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:02:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:34.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:02:34 compute-1 ceph-mon[80115]: pgmap v694: 353 pgs: 353 active+clean; 88 MiB data, 195 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 56 op/s
Feb 02 10:02:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:34.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c64000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/748742709' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:02:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3948756213' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c58000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:35 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c48000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:36.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:36 compute-1 ceph-mon[80115]: pgmap v695: 353 pgs: 353 active+clean; 88 MiB data, 195 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 53 op/s
Feb 02 10:02:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:36.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 e147: 3 total, 3 up, 3 in
Feb 02 10:02:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:37 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c3c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100237 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:02:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:37 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c44000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:37 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c44000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:38 compute-1 sudo[228916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:02:38 compute-1 sudo[228916]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:02:38 compute-1 sudo[228916]: pam_unix(sudo:session): session closed for user root
Feb 02 10:02:38 compute-1 ceph-mon[80115]: osdmap e147: 3 total, 3 up, 3 in
Feb 02 10:02:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:38.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:02:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:38.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:02:39 compute-1 ceph-mon[80115]: pgmap v697: 353 pgs: 353 active+clean; 88 MiB data, 216 MiB used, 60 GiB / 60 GiB avail; 3.0 MiB/s rd, 3.1 MiB/s wr, 70 op/s
Feb 02 10:02:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:39 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:39 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c3c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:39 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c3c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:40.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:40.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:41 compute-1 ceph-mon[80115]: pgmap v698: 353 pgs: 353 active+clean; 88 MiB data, 216 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 61 op/s
Feb 02 10:02:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:41 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c3c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:41 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c40000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:41 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:42.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:42.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:43 compute-1 ceph-mon[80115]: pgmap v699: 353 pgs: 353 active+clean; 88 MiB data, 216 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.2 MiB/s wr, 49 op/s
Feb 02 10:02:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:43 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c44001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:02:43 compute-1 kernel: ganesha.nfsd[228905]: segfault at 50 ip 00007f7ceda1032e sp 00007f7c627fb210 error 4 in libntirpc.so.5.8[7f7ced9f5000+2c000] likely on CPU 2 (core 0, socket 2)
Feb 02 10:02:43 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 10:02:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[228831]: 02/02/2026 10:02:43 : epoch 6980762f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7c3c0016a0 fd 38 proxy ignored for local
Feb 02 10:02:43 compute-1 systemd[1]: Started Process Core Dump (PID 228944/UID 0).
Feb 02 10:02:44 compute-1 systemd-coredump[228945]: Process 228835 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 46:
                                                    #0  0x00007f7ceda1032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 10:02:44 compute-1 systemd[1]: systemd-coredump@12-228944-0.service: Deactivated successfully.
Feb 02 10:02:44 compute-1 podman[228950]: 2026-02-02 10:02:44.486787204 +0000 UTC m=+0.027200515 container died 85ab4d728a71079bd2a743190419dfcd03951eb9e7ca1f94d0aafff8d6f59769 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Feb 02 10:02:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-28beda5ed4c246f2d3f1ed0902033b626bd7ff20b987eeaeecb35981e838cb7c-merged.mount: Deactivated successfully.
Feb 02 10:02:44 compute-1 podman[228950]: 2026-02-02 10:02:44.526778574 +0000 UTC m=+0.067191895 container remove 85ab4d728a71079bd2a743190419dfcd03951eb9e7ca1f94d0aafff8d6f59769 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Feb 02 10:02:44 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 10:02:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:44.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:44 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 10:02:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:44.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:02:44.902 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:02:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:02:44.903 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:02:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:02:44.904 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:02:45 compute-1 ceph-mon[80115]: pgmap v700: 353 pgs: 353 active+clean; 88 MiB data, 216 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Feb 02 10:02:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:46.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:46.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:47 compute-1 ceph-mon[80115]: pgmap v701: 353 pgs: 353 active+clean; 88 MiB data, 216 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Feb 02 10:02:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:02:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:48.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:48.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:49 compute-1 ceph-mon[80115]: pgmap v702: 353 pgs: 353 active+clean; 88 MiB data, 216 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 78 op/s
Feb 02 10:02:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100249 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:02:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:50.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:50.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:51 compute-1 ceph-mon[80115]: pgmap v703: 353 pgs: 353 active+clean; 88 MiB data, 216 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 68 op/s
Feb 02 10:02:52 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 02 10:02:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:52 compute-1 podman[228997]: 2026-02-02 10:02:52.405358295 +0000 UTC m=+0.079951620 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 02 10:02:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:52.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:52 compute-1 ceph-mon[80115]: pgmap v704: 353 pgs: 353 active+clean; 88 MiB data, 216 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 68 op/s
Feb 02 10:02:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:52.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:54 compute-1 sudo[229026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:02:54 compute-1 sudo[229026]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:02:54 compute-1 sudo[229026]: pam_unix(sudo:session): session closed for user root
Feb 02 10:02:54 compute-1 sudo[229051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:02:54 compute-1 sudo[229051]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:02:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:54.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:54.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:54 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 13.
Feb 02 10:02:54 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 10:02:54 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1...
Feb 02 10:02:55 compute-1 podman[229143]: 2026-02-02 10:02:55.025947251 +0000 UTC m=+0.059890673 container create 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Feb 02 10:02:55 compute-1 sudo[229051]: pam_unix(sudo:session): session closed for user root
Feb 02 10:02:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aab1a8eaea1398d15f9f0e4bf76ebc7ab73640ad6ddb093b4227eedbb09799dc/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Feb 02 10:02:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aab1a8eaea1398d15f9f0e4bf76ebc7ab73640ad6ddb093b4227eedbb09799dc/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 10:02:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aab1a8eaea1398d15f9f0e4bf76ebc7ab73640ad6ddb093b4227eedbb09799dc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 10:02:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aab1a8eaea1398d15f9f0e4bf76ebc7ab73640ad6ddb093b4227eedbb09799dc/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.mhzhsx-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Feb 02 10:02:55 compute-1 podman[229143]: 2026-02-02 10:02:55.081052418 +0000 UTC m=+0.114995840 container init 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2)
Feb 02 10:02:55 compute-1 podman[229143]: 2026-02-02 10:02:55.087085567 +0000 UTC m=+0.121028989 container start 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 10:02:55 compute-1 bash[229143]: 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31
Feb 02 10:02:55 compute-1 podman[229143]: 2026-02-02 10:02:55.001159201 +0000 UTC m=+0.035102643 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 10:02:55 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 10:02:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:02:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Feb 02 10:02:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:02:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Feb 02 10:02:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:02:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Feb 02 10:02:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:02:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Feb 02 10:02:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:02:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Feb 02 10:02:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:02:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Feb 02 10:02:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:02:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Feb 02 10:02:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:02:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 10:02:55 compute-1 ceph-mon[80115]: pgmap v705: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 133 op/s
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2723065453' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2723065453' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:02:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:02:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:02:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:56.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:02:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:56.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:02:57 compute-1 ceph-mon[80115]: pgmap v706: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Feb 02 10:02:58 compute-1 sudo[229212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:02:58 compute-1 sudo[229212]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:02:58 compute-1 sudo[229212]: pam_unix(sudo:session): session closed for user root
Feb 02 10:02:58 compute-1 podman[229236]: 2026-02-02 10:02:58.36763813 +0000 UTC m=+0.072582677 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:02:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:02:58.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:02:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:02:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:02:58.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:02:59 compute-1 ceph-mon[80115]: pgmap v707: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:03:00 compute-1 sudo[229257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:03:00 compute-1 sudo[229257]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:03:00 compute-1 sudo[229257]: pam_unix(sudo:session): session closed for user root
Feb 02 10:03:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:00.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:00.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:03:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:03:01 compute-1 ceph-mon[80115]: pgmap v708: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:03:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 10:03:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:03:01 compute-1 nova_compute[226294]: 2026-02-02 10:03:01.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:01 compute-1 nova_compute[226294]: 2026-02-02 10:03:01.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 02 10:03:01 compute-1 nova_compute[226294]: 2026-02-02 10:03:01.667 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 02 10:03:01 compute-1 nova_compute[226294]: 2026-02-02 10:03:01.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:01 compute-1 nova_compute[226294]: 2026-02-02 10:03:01.667 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 02 10:03:01 compute-1 nova_compute[226294]: 2026-02-02 10:03:01.678 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:03:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:03:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:02.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:03:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:02.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:03 compute-1 ceph-mon[80115]: pgmap v709: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:03:03 compute-1 nova_compute[226294]: 2026-02-02 10:03:03.686 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:04.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:04.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:05 compute-1 ceph-mon[80115]: pgmap v710: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 335 KiB/s rd, 2.1 MiB/s wr, 68 op/s
Feb 02 10:03:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1855667387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:05 compute-1 nova_compute[226294]: 2026-02-02 10:03:05.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2552743148' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:06 compute-1 nova_compute[226294]: 2026-02-02 10:03:06.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:06 compute-1 nova_compute[226294]: 2026-02-02 10:03:06.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:06 compute-1 nova_compute[226294]: 2026-02-02 10:03:06.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:03:06 compute-1 nova_compute[226294]: 2026-02-02 10:03:06.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:03:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:06.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:06 compute-1 nova_compute[226294]: 2026-02-02 10:03:06.679 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:03:06 compute-1 nova_compute[226294]: 2026-02-02 10:03:06.679 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:06 compute-1 nova_compute[226294]: 2026-02-02 10:03:06.680 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:06 compute-1 nova_compute[226294]: 2026-02-02 10:03:06.680 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:06 compute-1 nova_compute[226294]: 2026-02-02 10:03:06.680 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:03:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:06.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Feb 02 10:03:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:07 compute-1 ceph-mon[80115]: pgmap v711: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 8.0 KiB/s rd, 13 KiB/s wr, 3 op/s
Feb 02 10:03:07 compute-1 nova_compute[226294]: 2026-02-02 10:03:07.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:07 compute-1 nova_compute[226294]: 2026-02-02 10:03:07.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:03:07 compute-1 nova_compute[226294]: 2026-02-02 10:03:07.676 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:07 compute-1 nova_compute[226294]: 2026-02-02 10:03:07.676 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:07 compute-1 nova_compute[226294]: 2026-02-02 10:03:07.677 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:07 compute-1 nova_compute[226294]: 2026-02-02 10:03:07.677 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:03:07 compute-1 nova_compute[226294]: 2026-02-02 10:03:07.677 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:08 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:03:08 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1066850920' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.173 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.377 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.379 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5188MB free_disk=59.9427490234375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.379 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.379 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.613 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.613 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:03:08 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3086339622' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:08 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1066850920' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:08.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.687 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.757 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.758 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.770 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.789 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
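
The inventory reported for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 in the refresh above is what placement turns into schedulable capacity: for each resource class, usable = (total - reserved) * allocation_ratio. A small worked sketch with the exact values from this log:

```python
# Worked example: the effective capacity placement derives from the
# inventory logged above, i.e. (total - reserved) * allocation_ratio.
inventory = {
    "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {capacity:g} schedulable")
# VCPU: 32, MEMORY_MB: 7167, DISK_GB: 53.1
```
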
Feb 02 10:03:08 compute-1 nova_compute[226294]: 2026-02-02 10:03:08.802 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:08.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:03:09 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1847353346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.274 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.278 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.289 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.290 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.290 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:09 compute-1 ceph-mon[80115]: pgmap v712: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 8.2 KiB/s rd, 17 KiB/s wr, 4 op/s
Feb 02 10:03:09 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1336790910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:09 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1847353346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100309 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:03:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.787 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.787 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.806 226298 DEBUG nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 02 10:03:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.880 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.881 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.888 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.888 226298 INFO nova.compute.claims [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Claim successful on node compute-1.ctlplane.example.com
Feb 02 10:03:09 compute-1 nova_compute[226294]: 2026-02-02 10:03:09.998 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:10 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:03:10 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1779420540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.473 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.480 226298 DEBUG nova.compute.provider_tree [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.498 226298 DEBUG nova.scheduler.client.report [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.526 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.527 226298 DEBUG nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.590 226298 DEBUG nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.591 226298 DEBUG nova.network.neutron [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.616 226298 INFO nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.643 226298 DEBUG nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 02 10:03:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:10.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.790 226298 DEBUG nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.792 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.793 226298 INFO nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Creating image(s)
Feb 02 10:03:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1779420540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:10 compute-1 ceph-mon[80115]: pgmap v713: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 3.7 KiB/s wr, 2 op/s
Feb 02 10:03:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:10.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.831 226298 DEBUG nova.storage.rbd_utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 7440c4af-7e45-4796-ac03-ddd1eb035702_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.861 226298 DEBUG nova.storage.rbd_utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 7440c4af-7e45-4796-ac03-ddd1eb035702_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.889 226298 DEBUG nova.storage.rbd_utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 7440c4af-7e45-4796-ac03-ddd1eb035702_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.893 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:10 compute-1 nova_compute[226294]: 2026-02-02 10:03:10.894 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:11 compute-1 nova_compute[226294]: 2026-02-02 10:03:11.729 226298 DEBUG nova.virt.libvirt.imagebackend [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image locations are: [{'url': 'rbd://d241d473-9fcb-5f74-b163-f1ca4454e7f1/images/d5e062d7-95ef-409c-9ad0-60f7cf6f44ce/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://d241d473-9fcb-5f74-b163-f1ca4454e7f1/images/d5e062d7-95ef-409c-9ad0-60f7cf6f44ce/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Feb 02 10:03:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:11 compute-1 nova_compute[226294]: 2026-02-02 10:03:11.813 226298 WARNING oslo_policy.policy [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 02 10:03:11 compute-1 nova_compute[226294]: 2026-02-02 10:03:11.813 226298 WARNING oslo_policy.policy [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
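
The repeated oslo_policy warning above names its own remedy: convert the JSON-formatted policy file to YAML with `oslopolicy-convert-json-to-yaml`. A minimal sketch of that invocation; the policy file paths are assumptions for illustration, not values taken from this host:

```python
# Minimal sketch of the conversion the deprecation warning recommends.
# The policy file paths are assumed, not read from this log.
import subprocess

subprocess.run(
    [
        "oslopolicy-convert-json-to-yaml",
        "--namespace", "nova",                     # project whose defaults to merge
        "--policy-file", "/etc/nova/policy.json",  # assumed existing JSON policy
        "--output-file", "/etc/nova/policy.yaml",  # assumed YAML destination
    ],
    check=True,
)
```
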
Feb 02 10:03:11 compute-1 nova_compute[226294]: 2026-02-02 10:03:11.816 226298 DEBUG nova.policy [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b1695a2a70d4aa0aa350ba17d8f6d5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 02 10:03:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc0025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:12.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:12 compute-1 ceph-mon[80115]: pgmap v714: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 3.7 KiB/s wr, 2 op/s
Feb 02 10:03:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:12.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:13 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:13.386 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:03:13 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:13.388 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.593 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.678 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61.part --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
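
The command in the line above shows nova introspecting the downloaded image under `oslo_concurrency.prlimit`, which caps the helper at 1 GiB of address space and 30 s of CPU so a malformed image cannot exhaust the host. A sketch repeating the same capped probe and reading its JSON result (the path is copied from the log):

```python
# Sketch of the resource-capped image introspection logged above:
# qemu-img info wrapped by oslo_concurrency.prlimit, output parsed as JSON.
import json
import subprocess

path = "/var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61.part"
out = subprocess.run(
    ["/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
     "--as=1073741824", "--cpu=30", "--",
     "env", "LC_ALL=C", "LANG=C",
     "qemu-img", "info", path, "--force-share", "--output=json"],
    capture_output=True, check=True, text=True,
).stdout

info = json.loads(out)
print(info["format"], info["virtual-size"])  # e.g. "qcow2" and the size in bytes
```
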
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.679 226298 DEBUG nova.virt.images [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] d5e062d7-95ef-409c-9ad0-60f7cf6f44ce was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.681 226298 DEBUG nova.privsep.utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.681 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61.part /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.885 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61.part /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61.converted" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.889 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.927 226298 DEBUG nova.network.neutron [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Successfully created port: 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.957 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61.converted --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.958 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.986 226298 DEBUG nova.storage.rbd_utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 7440c4af-7e45-4796-ac03-ddd1eb035702_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:03:13 compute-1 nova_compute[226294]: 2026-02-02 10:03:13.990 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 7440c4af-7e45-4796-ac03-ddd1eb035702_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:14 compute-1 nova_compute[226294]: 2026-02-02 10:03:14.382 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 7440c4af-7e45-4796-ac03-ddd1eb035702_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:14 compute-1 nova_compute[226294]: 2026-02-02 10:03:14.498 226298 DEBUG nova.storage.rbd_utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] resizing rbd image 7440c4af-7e45-4796-ac03-ddd1eb035702_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
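
Taken together, the entries from 10:03:13.681 through 10:03:14.498 are the whole raw-on-RBD provisioning path: convert the cached qcow2 to raw, import it into the vms pool as <instance-uuid>_disk, then grow it to the flavor's 1 GiB root disk. A sketch of the same three steps using the commands recorded above; file and image names are copied from the log, the rename between convert and import mirrors nova's image-cache behaviour (assumed here), and `rbd resize` stands in for the librbd call nova actually makes:

```python
# Sketch of the convert -> import -> resize sequence logged above.
# Illustrative only; do not re-run against a live cluster.
import os
import subprocess

base = "/var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61"
disk = "7440c4af-7e45-4796-ac03-ddd1eb035702_disk"
ceph = ["--id", "openstack", "--conf", "/etc/ceph/ceph.conf"]

def run(*cmd):
    subprocess.run(cmd, check=True)

# 1. Convert the fetched qcow2 to raw next to it in the image cache,
#    then move it to the bare cache name the import below uses.
run("qemu-img", "convert", "-t", "none", "-O", "raw", "-f", "qcow2",
    base + ".part", base + ".converted")
os.replace(base + ".converted", base)

# 2. Import the raw file into the Ceph 'vms' pool as the instance root disk.
run("rbd", "import", "--pool", "vms", base, disk, "--image-format=2", *ceph)

# 3. Grow the image to the 1 GiB root disk requested by the flavor.
run("rbd", "resize", "--pool", "vms", disk, "--size", "1G", *ceph)
```
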
Feb 02 10:03:14 compute-1 nova_compute[226294]: 2026-02-02 10:03:14.671 226298 DEBUG nova.objects.instance [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'migration_context' on Instance uuid 7440c4af-7e45-4796-ac03-ddd1eb035702 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:03:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:03:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:14.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:03:14 compute-1 nova_compute[226294]: 2026-02-02 10:03:14.695 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 02 10:03:14 compute-1 nova_compute[226294]: 2026-02-02 10:03:14.696 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Ensure instance console log exists: /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 02 10:03:14 compute-1 nova_compute[226294]: 2026-02-02 10:03:14.697 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:14 compute-1 nova_compute[226294]: 2026-02-02 10:03:14.697 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:14 compute-1 nova_compute[226294]: 2026-02-02 10:03:14.698 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:14.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:15 compute-1 nova_compute[226294]: 2026-02-02 10:03:15.340 226298 DEBUG nova.network.neutron [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Successfully updated port: 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 02 10:03:15 compute-1 nova_compute[226294]: 2026-02-02 10:03:15.511 226298 DEBUG nova.compute.manager [req-a504d106-883c-4fb4-9d39-29922ab59ebe req-4eb6ae37-1033-4dc5-9fa5-58c09f0a7b5f b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-changed-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:03:15 compute-1 nova_compute[226294]: 2026-02-02 10:03:15.512 226298 DEBUG nova.compute.manager [req-a504d106-883c-4fb4-9d39-29922ab59ebe req-4eb6ae37-1033-4dc5-9fa5-58c09f0a7b5f b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Refreshing instance network info cache due to event network-changed-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:03:15 compute-1 nova_compute[226294]: 2026-02-02 10:03:15.512 226298 DEBUG oslo_concurrency.lockutils [req-a504d106-883c-4fb4-9d39-29922ab59ebe req-4eb6ae37-1033-4dc5-9fa5-58c09f0a7b5f b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-7440c4af-7e45-4796-ac03-ddd1eb035702" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:03:15 compute-1 nova_compute[226294]: 2026-02-02 10:03:15.513 226298 DEBUG oslo_concurrency.lockutils [req-a504d106-883c-4fb4-9d39-29922ab59ebe req-4eb6ae37-1033-4dc5-9fa5-58c09f0a7b5f b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-7440c4af-7e45-4796-ac03-ddd1eb035702" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:03:15 compute-1 nova_compute[226294]: 2026-02-02 10:03:15.513 226298 DEBUG nova.network.neutron [req-a504d106-883c-4fb4-9d39-29922ab59ebe req-4eb6ae37-1033-4dc5-9fa5-58c09f0a7b5f b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Refreshing network info cache for port 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:03:15 compute-1 nova_compute[226294]: 2026-02-02 10:03:15.516 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "refresh_cache-7440c4af-7e45-4796-ac03-ddd1eb035702" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:03:15 compute-1 ceph-mon[80115]: pgmap v715: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 4.8 KiB/s wr, 9 op/s
Feb 02 10:03:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc0025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:16 compute-1 nova_compute[226294]: 2026-02-02 10:03:16.500 226298 DEBUG nova.network.neutron [req-a504d106-883c-4fb4-9d39-29922ab59ebe req-4eb6ae37-1033-4dc5-9fa5-58c09f0a7b5f b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 02 10:03:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:16.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:16.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:17 compute-1 nova_compute[226294]: 2026-02-02 10:03:17.540 226298 DEBUG nova.network.neutron [req-a504d106-883c-4fb4-9d39-29922ab59ebe req-4eb6ae37-1033-4dc5-9fa5-58c09f0a7b5f b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:03:17 compute-1 nova_compute[226294]: 2026-02-02 10:03:17.574 226298 DEBUG oslo_concurrency.lockutils [req-a504d106-883c-4fb4-9d39-29922ab59ebe req-4eb6ae37-1033-4dc5-9fa5-58c09f0a7b5f b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-7440c4af-7e45-4796-ac03-ddd1eb035702" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:03:17 compute-1 nova_compute[226294]: 2026-02-02 10:03:17.576 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquired lock "refresh_cache-7440c4af-7e45-4796-ac03-ddd1eb035702" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:03:17 compute-1 nova_compute[226294]: 2026-02-02 10:03:17.576 226298 DEBUG nova.network.neutron [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 02 10:03:17 compute-1 ceph-mon[80115]: pgmap v716: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 4.5 KiB/s wr, 7 op/s
Feb 02 10:03:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:03:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc0032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:17 compute-1 nova_compute[226294]: 2026-02-02 10:03:17.979 226298 DEBUG nova.network.neutron [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 02 10:03:18 compute-1 sudo[229552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:03:18 compute-1 sudo[229552]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:03:18 compute-1 sudo[229552]: pam_unix(sudo:session): session closed for user root
Feb 02 10:03:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:18.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:18.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.922 226298 DEBUG nova.network.neutron [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Updating instance_info_cache with network_info: [{"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.941 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Releasing lock "refresh_cache-7440c4af-7e45-4796-ac03-ddd1eb035702" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.942 226298 DEBUG nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Instance network_info: |[{"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
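
The network_info blob in the cache update above carries the addressing that the vif-plug and guest-XML steps consume next. A small sketch pulling the tap device, MAC, fixed IP and gateway out of an abridged copy of that structure:

```python
# Minimal sketch: extract addressing details from an abridged copy of the
# network_info structure logged in the instance_info_cache update above.
network_info = [{
    "id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5",
    "address": "fa:16:3e:7b:4d:0e",
    "devname": "tap6cd44411-fc",
    "network": {
        "subnets": [{
            "cidr": "10.100.0.16/28",
            "gateway": {"address": "10.100.0.17"},
            "ips": [{"address": "10.100.0.25", "type": "fixed"}],
        }],
    },
}]

for vif in network_info:
    subnet = vif["network"]["subnets"][0]
    fixed = [ip["address"] for ip in subnet["ips"] if ip["type"] == "fixed"]
    print(f"{vif['devname']}: mac={vif['address']} ips={fixed} "
          f"gw={subnet['gateway']['address']} cidr={subnet['cidr']}")
```
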
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.946 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Start _get_guest_xml network_info=[{"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'guest_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': 'd5e062d7-95ef-409c-9ad0-60f7cf6f44ce'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.951 226298 WARNING nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.955 226298 DEBUG nova.virt.libvirt.host [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.956 226298 DEBUG nova.virt.libvirt.host [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.959 226298 DEBUG nova.virt.libvirt.host [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.959 226298 DEBUG nova.virt.libvirt.host [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.960 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.960 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-02T10:01:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1194feb9-e285-414e-825a-1e77171d092f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.961 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.961 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.962 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.962 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.963 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.963 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.963 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.964 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.964 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.964 226298 DEBUG nova.virt.hardware [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.970 226298 DEBUG nova.privsep.utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 02 10:03:18 compute-1 nova_compute[226294]: 2026-02-02 10:03:18.970 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:19 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 02 10:03:19 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3937314563' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:03:19 compute-1 nova_compute[226294]: 2026-02-02 10:03:19.477 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:19 compute-1 nova_compute[226294]: 2026-02-02 10:03:19.502 226298 DEBUG nova.storage.rbd_utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 7440c4af-7e45-4796-ac03-ddd1eb035702_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:03:19 compute-1 nova_compute[226294]: 2026-02-02 10:03:19.506 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:19 compute-1 ceph-mon[80115]: pgmap v717: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 35 op/s
Feb 02 10:03:19 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3937314563' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:03:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc0032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:19 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 02 10:03:19 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2403354180' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:03:19 compute-1 nova_compute[226294]: 2026-02-02 10:03:19.995 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:19 compute-1 nova_compute[226294]: 2026-02-02 10:03:19.998 226298 DEBUG nova.virt.libvirt.vif [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:03:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2085006119',display_name='tempest-TestNetworkBasicOps-server-2085006119',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2085006119',id=2,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPTNHym9kIQXfz4suqCv0Mc6jF6e6B0Wqnxt3FW6kpn3GvVkI5IjdsdGcg/l/4S1jaKaNrV8XIFTm81yb+PAyRJlVvfR+xXnnQd/vZvQFfCYnwki1MFmz37JvJeahfBCnw==',key_name='tempest-TestNetworkBasicOps-1317968913',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-qht7ug2q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:03:10Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=7440c4af-7e45-4796-ac03-ddd1eb035702,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 02 10:03:19 compute-1 nova_compute[226294]: 2026-02-02 10:03:19.998 226298 DEBUG nova.network.os_vif_util [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.000 226298 DEBUG nova.network.os_vif_util [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.003 226298 DEBUG nova.objects.instance [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'pci_devices' on Instance uuid 7440c4af-7e45-4796-ac03-ddd1eb035702 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.022 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] End _get_guest_xml xml=<domain type="kvm">
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <uuid>7440c4af-7e45-4796-ac03-ddd1eb035702</uuid>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <name>instance-00000002</name>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <memory>131072</memory>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <vcpu>1</vcpu>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <metadata>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <nova:name>tempest-TestNetworkBasicOps-server-2085006119</nova:name>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <nova:creationTime>2026-02-02 10:03:18</nova:creationTime>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <nova:flavor name="m1.nano">
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <nova:memory>128</nova:memory>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <nova:disk>1</nova:disk>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <nova:swap>0</nova:swap>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <nova:vcpus>1</nova:vcpus>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       </nova:flavor>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <nova:owner>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       </nova:owner>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <nova:ports>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <nova:port uuid="6cd44411-fc3c-47f5-9a7e-daa3acfc46b5">
Feb 02 10:03:20 compute-1 nova_compute[226294]:           <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         </nova:port>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       </nova:ports>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     </nova:instance>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   </metadata>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <sysinfo type="smbios">
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <system>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <entry name="manufacturer">RDO</entry>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <entry name="product">OpenStack Compute</entry>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <entry name="serial">7440c4af-7e45-4796-ac03-ddd1eb035702</entry>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <entry name="uuid">7440c4af-7e45-4796-ac03-ddd1eb035702</entry>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <entry name="family">Virtual Machine</entry>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     </system>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   </sysinfo>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <os>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <boot dev="hd"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <smbios mode="sysinfo"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   </os>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <features>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <acpi/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <apic/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <vmcoreinfo/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   </features>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <clock offset="utc">
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <timer name="pit" tickpolicy="delay"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <timer name="hpet" present="no"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   </clock>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <cpu mode="host-model" match="exact">
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <topology sockets="1" cores="1" threads="1"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   </cpu>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   <devices>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <disk type="network" device="disk">
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <driver type="raw" cache="none"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <source protocol="rbd" name="vms/7440c4af-7e45-4796-ac03-ddd1eb035702_disk">
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <host name="192.168.122.100" port="6789"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <host name="192.168.122.102" port="6789"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <host name="192.168.122.101" port="6789"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       </source>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <auth username="openstack">
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <target dev="vda" bus="virtio"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <disk type="network" device="cdrom">
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <driver type="raw" cache="none"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <source protocol="rbd" name="vms/7440c4af-7e45-4796-ac03-ddd1eb035702_disk.config">
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <host name="192.168.122.100" port="6789"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <host name="192.168.122.102" port="6789"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <host name="192.168.122.101" port="6789"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       </source>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <auth username="openstack">
Feb 02 10:03:20 compute-1 nova_compute[226294]:         <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <target dev="sda" bus="sata"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <interface type="ethernet">
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <mac address="fa:16:3e:7b:4d:0e"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <model type="virtio"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <driver name="vhost" rx_queue_size="512"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <mtu size="1442"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <target dev="tap6cd44411-fc"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     </interface>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <serial type="pty">
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <log file="/var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702/console.log" append="off"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     </serial>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <video>
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <model type="virtio"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     </video>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <input type="tablet" bus="usb"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <rng model="virtio">
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <backend model="random">/dev/urandom</backend>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     </rng>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <controller type="usb" index="0"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     <memballoon model="virtio">
Feb 02 10:03:20 compute-1 nova_compute[226294]:       <stats period="10"/>
Feb 02 10:03:20 compute-1 nova_compute[226294]:     </memballoon>
Feb 02 10:03:20 compute-1 nova_compute[226294]:   </devices>
Feb 02 10:03:20 compute-1 nova_compute[226294]: </domain>
Feb 02 10:03:20 compute-1 nova_compute[226294]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.023 226298 DEBUG nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Preparing to wait for external event network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.024 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.024 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.025 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.026 226298 DEBUG nova.virt.libvirt.vif [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:03:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2085006119',display_name='tempest-TestNetworkBasicOps-server-2085006119',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2085006119',id=2,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPTNHym9kIQXfz4suqCv0Mc6jF6e6B0Wqnxt3FW6kpn3GvVkI5IjdsdGcg/l/4S1jaKaNrV8XIFTm81yb+PAyRJlVvfR+xXnnQd/vZvQFfCYnwki1MFmz37JvJeahfBCnw==',key_name='tempest-TestNetworkBasicOps-1317968913',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-qht7ug2q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:03:10Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=7440c4af-7e45-4796-ac03-ddd1eb035702,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.026 226298 DEBUG nova.network.os_vif_util [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.027 226298 DEBUG nova.network.os_vif_util [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.028 226298 DEBUG os_vif [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.090 226298 DEBUG ovsdbapp.backend.ovs_idl [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.090 226298 DEBUG ovsdbapp.backend.ovs_idl [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.091 226298 DEBUG ovsdbapp.backend.ovs_idl [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.092 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.093 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [POLLOUT] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.094 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.094 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.096 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.099 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.113 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.113 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.114 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 02 10:03:20 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.115 226298 INFO oslo.privsep.daemon [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpx_r79dby/privsep.sock']
Feb 02 10:03:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:03:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:20.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:03:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2403354180' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:03:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:03:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:20.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.027 226298 INFO oslo.privsep.daemon [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Spawned new privsep daemon via rootwrap
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.918 229645 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.921 229645 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.924 229645 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:20.924 229645 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229645
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.316 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.316 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6cd44411-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.318 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6cd44411-fc, col_values=(('external_ids', {'iface-id': '6cd44411-fc3c-47f5-9a7e-daa3acfc46b5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:4d:0e', 'vm-uuid': '7440c4af-7e45-4796-ac03-ddd1eb035702'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.320 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:21 compute-1 NetworkManager[49055]: <info>  [1770026601.3213] manager: (tap6cd44411-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.324 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.327 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.328 226298 INFO os_vif [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc')
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.381 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.381 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.382 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No VIF found with MAC fa:16:3e:7b:4d:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.383 226298 INFO nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Using config drive
Feb 02 10:03:21 compute-1 nova_compute[226294]: 2026-02-02 10:03:21.413 226298 DEBUG nova.storage.rbd_utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 7440c4af-7e45-4796-ac03-ddd1eb035702_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:03:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:21 compute-1 ceph-mon[80115]: pgmap v718: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Feb 02 10:03:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:22 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:22.390 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:03:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:22.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:22 compute-1 ceph-mon[80115]: pgmap v719: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Feb 02 10:03:22 compute-1 nova_compute[226294]: 2026-02-02 10:03:22.753 226298 INFO nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Creating config drive at /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702/disk.config
Feb 02 10:03:22 compute-1 nova_compute[226294]: 2026-02-02 10:03:22.760 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp0isvfgr9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:22.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:22 compute-1 nova_compute[226294]: 2026-02-02 10:03:22.890 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp0isvfgr9" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:22 compute-1 nova_compute[226294]: 2026-02-02 10:03:22.924 226298 DEBUG nova.storage.rbd_utils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 7440c4af-7e45-4796-ac03-ddd1eb035702_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:03:22 compute-1 nova_compute[226294]: 2026-02-02 10:03:22.928 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702/disk.config 7440c4af-7e45-4796-ac03-ddd1eb035702_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.097 226298 DEBUG oslo_concurrency.processutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702/disk.config 7440c4af-7e45-4796-ac03-ddd1eb035702_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.098 226298 INFO nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Deleting local config drive /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702/disk.config because it was imported into RBD.
Feb 02 10:03:23 compute-1 systemd[1]: Starting libvirt secret daemon...
Feb 02 10:03:23 compute-1 systemd[1]: Started libvirt secret daemon.
Feb 02 10:03:23 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 02 10:03:23 compute-1 kernel: tap6cd44411-fc: entered promiscuous mode
Feb 02 10:03:23 compute-1 NetworkManager[49055]: <info>  [1770026603.2634] manager: (tap6cd44411-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Feb 02 10:03:23 compute-1 ovn_controller[133666]: 2026-02-02T10:03:23Z|00027|binding|INFO|Claiming lport 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 for this chassis.
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.263 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:23 compute-1 ovn_controller[133666]: 2026-02-02T10:03:23Z|00028|binding|INFO|6cd44411-fc3c-47f5-9a7e-daa3acfc46b5: Claiming fa:16:3e:7b:4d:0e 10.100.0.25
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.267 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:23 compute-1 podman[229710]: 2026-02-02 10:03:23.2740067 +0000 UTC m=+0.136677870 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:03:23 compute-1 systemd-udevd[229768]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 10:03:23 compute-1 NetworkManager[49055]: <info>  [1770026603.2996] device (tap6cd44411-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 02 10:03:23 compute-1 NetworkManager[49055]: <info>  [1770026603.3004] device (tap6cd44411-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.302 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:4d:0e 10.100.0.25'], port_security=['fa:16:3e:7b:4d:0e 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '7440c4af-7e45-4796-ac03-ddd1eb035702', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e0f239e2-848d-4af8-8655-52de33d6c78c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ef61564f-168a-4d39-98a6-1b124de25a8e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=690cea54-b458-4b47-8c01-a695a89a554b, chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.303 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 in datapath e0f239e2-848d-4af8-8655-52de33d6c78c bound to our chassis
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.305 143542 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e0f239e2-848d-4af8-8655-52de33d6c78c
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.306 143542 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpfvjbxss0/privsep.sock']
Feb 02 10:03:23 compute-1 ovn_controller[133666]: 2026-02-02T10:03:23Z|00029|binding|INFO|Setting lport 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 ovn-installed in OVS
Feb 02 10:03:23 compute-1 ovn_controller[133666]: 2026-02-02T10:03:23Z|00030|binding|INFO|Setting lport 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 up in Southbound
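The three ovn-controller binding INFO lines above (Claiming lport, Setting ovn-installed in OVS, Setting up in Southbound) trace the normal port-claim sequence for lport 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 on this chassis. A minimal, illustrative Python sketch for pulling that sequence out of a saved copy of this journal is shown below; the file name and the regular expression are assumptions made for the example, not part of the log.

    # trace_lport_claims.py -- illustrative sketch, assumes the journal was saved to a text file
    import re
    from collections import defaultdict

    BINDING = re.compile(
        r"ovn_controller\[\d+\]: .*\|binding\|INFO\|"
        r"(?:Claiming lport (?P<claim>\S+) for this chassis"
        r"|Setting lport (?P<lport>\S+) (?P<what>ovn-installed in OVS|up in Southbound))"
    )

    def lport_timeline(path="compute-1.log"):
        """Group ovn-controller binding INFO lines by logical port name."""
        timeline = defaultdict(list)
        with open(path) as fh:
            for line in fh:
                m = BINDING.search(line)
                if not m:
                    continue
                port = m.group("claim") or m.group("lport")
                event = "claimed" if m.group("claim") else m.group("what")
                timeline[port].append(event)
        return dict(timeline)

    if __name__ == "__main__":
        for port, events in lport_timeline().items():
            print(port, "->", " / ".join(events))

For the port above this would print something like "claimed / ovn-installed in OVS / up in Southbound", which is the complete, healthy sequence.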
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.314 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:23 compute-1 systemd-machined[195072]: New machine qemu-1-instance-00000002.
Feb 02 10:03:23 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Feb 02 10:03:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.722 226298 DEBUG nova.compute.manager [req-3148a682-6d07-479d-9b30-bfc01d6e4c93 req-e86a4b7e-1a0f-4faa-9f17-979af3b0cc2a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.722 226298 DEBUG oslo_concurrency.lockutils [req-3148a682-6d07-479d-9b30-bfc01d6e4c93 req-e86a4b7e-1a0f-4faa-9f17-979af3b0cc2a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.723 226298 DEBUG oslo_concurrency.lockutils [req-3148a682-6d07-479d-9b30-bfc01d6e4c93 req-e86a4b7e-1a0f-4faa-9f17-979af3b0cc2a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.723 226298 DEBUG oslo_concurrency.lockutils [req-3148a682-6d07-479d-9b30-bfc01d6e4c93 req-e86a4b7e-1a0f-4faa-9f17-979af3b0cc2a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.723 226298 DEBUG nova.compute.manager [req-3148a682-6d07-479d-9b30-bfc01d6e4c93 req-e86a4b7e-1a0f-4faa-9f17-979af3b0cc2a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Processing event network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.755 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770026603.7544875, 7440c4af-7e45-4796-ac03-ddd1eb035702 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.755 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] VM Started (Lifecycle Event)
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.757 226298 DEBUG nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.769 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.773 226298 INFO nova.virt.libvirt.driver [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Instance spawned successfully.
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.773 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 02 10:03:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.780 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.784 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.799 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.800 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.800 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.801 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.801 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.802 226298 DEBUG nova.virt.libvirt.driver [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.841 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.842 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770026603.7546926, 7440c4af-7e45-4796-ac03-ddd1eb035702 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.842 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] VM Paused (Lifecycle Event)
Feb 02 10:03:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.875 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.878 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770026603.760506, 7440c4af-7e45-4796-ac03-ddd1eb035702 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.878 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] VM Resumed (Lifecycle Event)
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.884 226298 INFO nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Took 13.09 seconds to spawn the instance on the hypervisor.
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.885 226298 DEBUG nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.913 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.915 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.973 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] During sync_power_state the instance has a pending task (spawning). Skip.
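The three lifecycle events above (VM Started, VM Paused, VM Resumed) are emitted by the libvirt driver while the guest is being created; each one triggers a power-state sync that nova-compute skips because the instance still has task_state spawning. An illustrative sketch for listing lifecycle events per instance from a saved copy of this log follows; the file name is an assumption.

    # lifecycle_events.py -- illustrative sketch: collect nova lifecycle events per instance
    import re
    from collections import defaultdict

    LIFECYCLE = re.compile(r"\[instance: (?P<uuid>[0-9a-f-]+)\] VM (?P<event>\w+) \(Lifecycle Event\)")

    def lifecycle(path="compute-1.log"):
        """Return {instance uuid: [event, ...]} from nova-compute lifecycle INFO lines."""
        events = defaultdict(list)
        with open(path) as fh:
            for line in fh:
                m = LIFECYCLE.search(line)
                if m:
                    events[m.group("uuid")].append(m.group("event"))
        return dict(events)

    if __name__ == "__main__":
        for uuid, seq in lifecycle().items():
            print(uuid, "->", " -> ".join(seq))

For instance 7440c4af-7e45-4796-ac03-ddd1eb035702 this yields Started -> Paused -> Resumed, matching the lines above.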
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.989 143542 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.990 143542 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpfvjbxss0/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.855 229827 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.860 229827 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.863 229827 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 02 10:03:23 compute-1 nova_compute[226294]: 2026-02-02 10:03:23.991 226298 INFO nova.compute.manager [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Took 14.14 seconds to build instance.
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.864 229827 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229827
Feb 02 10:03:23 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:23.994 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b7431e-6b7f-4e50-9fbb-d191892069fb]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:24 compute-1 nova_compute[226294]: 2026-02-02 10:03:24.009 226298 DEBUG oslo_concurrency.lockutils [None req-15192d88-df5d-4298-a2d7-50664196de1b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
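The spawn path ends with two INFO timings ("Took 13.09 seconds to spawn", "Took 14.14 seconds to build") and a 14.222 s hold on the per-instance build lock. A small sketch that extracts those figures from a saved copy of this log follows; it is only an example of how the numbers could be collected, and the file name is an assumption.

    # build_timings.py -- illustrative sketch: pull nova spawn/build timings from a saved log
    import re

    SPAWN = re.compile(r"\[instance: (?P<uuid>[0-9a-f-]+)\] Took (?P<secs>[\d.]+) seconds to spawn")
    BUILD = re.compile(r"\[instance: (?P<uuid>[0-9a-f-]+)\] Took (?P<secs>[\d.]+) seconds to build")

    def timings(path="compute-1.log"):
        """Return {instance uuid: {'spawn': seconds, 'build': seconds}} from nova INFO lines."""
        out = {}
        with open(path) as fh:
            for line in fh:
                for key, rx in (("spawn", SPAWN), ("build", BUILD)):
                    m = rx.search(line)
                    if m:
                        out.setdefault(m.group("uuid"), {})[key] = float(m.group("secs"))
        return out

    if __name__ == "__main__":
        for uuid, t in timings().items():
            print(uuid, t)

The gap between the two numbers (here about one second) is the time spent outside the hypervisor spawn itself, e.g. network and event bookkeeping.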
Feb 02 10:03:24 compute-1 nova_compute[226294]: 2026-02-02 10:03:24.513 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:24.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:24 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:24.718 229827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:24 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:24.718 229827 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:24 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:24.718 229827 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:24.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:25 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:25.584 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[905302c7-bda3-4d0c-badc-8564beaa3fdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:25 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:25.585 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape0f239e2-81 in ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 02 10:03:25 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:25.587 229827 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape0f239e2-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 02 10:03:25 compute-1 ceph-mon[80115]: pgmap v720: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 39 op/s
Feb 02 10:03:25 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:25.587 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2a9d6b-d839-4cc1-a68d-5c562ba10371]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:25 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:25.592 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[c114385b-000a-4906-95b3-87b32147b154]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:25 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:25.614 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[61487f99-5681-4beb-9dc9-feb43100099b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:25 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:25.631 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1240bf-c861-4d1a-824e-d827f99dc926]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:25 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:25.633 143542 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmphjysm6an/privsep.sock']
Feb 02 10:03:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:25 compute-1 nova_compute[226294]: 2026-02-02 10:03:25.822 226298 DEBUG nova.compute.manager [req-17defded-655c-4d98-b13b-34aead98e62c req-4e6dfd82-274e-462f-88f7-0d9a75fa1d58 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:03:25 compute-1 nova_compute[226294]: 2026-02-02 10:03:25.822 226298 DEBUG oslo_concurrency.lockutils [req-17defded-655c-4d98-b13b-34aead98e62c req-4e6dfd82-274e-462f-88f7-0d9a75fa1d58 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:25 compute-1 nova_compute[226294]: 2026-02-02 10:03:25.823 226298 DEBUG oslo_concurrency.lockutils [req-17defded-655c-4d98-b13b-34aead98e62c req-4e6dfd82-274e-462f-88f7-0d9a75fa1d58 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:25 compute-1 nova_compute[226294]: 2026-02-02 10:03:25.823 226298 DEBUG oslo_concurrency.lockutils [req-17defded-655c-4d98-b13b-34aead98e62c req-4e6dfd82-274e-462f-88f7-0d9a75fa1d58 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:25 compute-1 nova_compute[226294]: 2026-02-02 10:03:25.824 226298 DEBUG nova.compute.manager [req-17defded-655c-4d98-b13b-34aead98e62c req-4e6dfd82-274e-462f-88f7-0d9a75fa1d58 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] No waiting events found dispatching network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:03:25 compute-1 nova_compute[226294]: 2026-02-02 10:03:25.824 226298 WARNING nova.compute.manager [req-17defded-655c-4d98-b13b-34aead98e62c req-4e6dfd82-274e-462f-88f7-0d9a75fa1d58 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received unexpected event network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 for instance with vm_state active and task_state None.
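The warning above is consistent with the earlier lines: the first network-vif-plugged event for this port (10:03:23, req-3148a682) was consumed by the waiter started during spawn ("Instance event wait completed in 0 seconds"), while this second copy (10:03:25, req-17defded) arrives after the build has finished, so no waiter exists ("No waiting events found") and nova reports it as unexpected for an instance already in vm_state active. A hedged sketch for spotting that pattern in a saved log follows; the classification is purely line-based and illustrative.

    # vif_plug_events.py -- illustrative sketch: list network-vif-plugged events and whether a waiter consumed them
    import re

    UNEXPECTED = re.compile(r"Received unexpected event network-vif-plugged-(?P<port>\S+) ")
    RECEIVED = re.compile(r"Received event network-vif-plugged-(?P<port>\S+) ")
    COMPLETED = re.compile(r"Instance event wait completed .* for network-vif-plugged")

    def classify(path="compute-1.log"):
        """Return an ordered list of (kind, port) tuples for vif-plugged handling lines."""
        events = []
        with open(path) as fh:
            for line in fh:
                if m := UNEXPECTED.search(line):
                    events.append(("unexpected", m.group("port")))
                elif m := RECEIVED.search(line):
                    events.append(("received", m.group("port")))
                elif COMPLETED.search(line):
                    events.append(("waiter-completed", None))
        return events

    if __name__ == "__main__":
        for kind, port in classify():
            print(kind, port or "")

A "received" immediately followed by "waiter-completed" is the normal case; an "unexpected" entry with no pending waiter, as here, is benign when the instance is already active.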
Feb 02 10:03:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.274 143542 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.275 143542 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmphjysm6an/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.153 229842 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.158 229842 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.161 229842 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.162 229842 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229842
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.279 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[04df8505-c2f4-4e88-a08a-07e5289bf8c3]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
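Two privsep helpers are spawned above for different contexts: neutron.privileged.default (pid 229827, CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE) and neutron.privileged.link_cmd (pid 229842, only CAP_NET_ADMIN|CAP_SYS_ADMIN). A short, illustrative sketch for extracting the effective/permitted/inheritable sets from the "privsep process running with capabilities" lines follows; the file name is an assumption.

    # privsep_caps.py -- illustrative sketch: parse privsep capability lines from a saved journal export
    import re

    CAPS = re.compile(
        r"(?P<pid>\d+) INFO oslo\.privsep\.daemon \[-\] privsep process running with "
        r"capabilities \(eff/prm/inh\): (?P<eff>\S+)/(?P<prm>\S+)/(?P<inh>\S+)"
    )

    def _split(s):
        return frozenset() if s == "none" else frozenset(s.split("|"))

    def capability_sets(path="compute-1.log"):
        """Map privsep daemon pid -> (effective, permitted, inheritable) capability sets."""
        sets = {}
        with open(path) as fh:
            for line in fh:
                m = CAPS.search(line)
                if m:
                    sets[m.group("pid")] = tuple(_split(m.group(g)) for g in ("eff", "prm", "inh"))
        return sets

    if __name__ == "__main__":
        for pid, (eff, prm, inh) in capability_sets().items():
            print(pid, sorted(eff))

Comparing the two pids shows the link_cmd daemon runs with the narrower set, which matches the per-context privilege separation visible in the lines above.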
Feb 02 10:03:26 compute-1 nova_compute[226294]: 2026-02-02 10:03:26.320 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:26.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.767 229842 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.767 229842 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:26 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:26.767 229842 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:03:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:26.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.263 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae2552b-e498-453a-9899-04498d282dab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.289 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[7edeeb21-1753-4a9a-b0de-0e092304784d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 NetworkManager[49055]: <info>  [1770026607.2910] manager: (tape0f239e2-80): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Feb 02 10:03:27 compute-1 systemd-udevd[229855]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.312 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[9d9a97a7-5f65-454f-ae38-2d1ad844db2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.315 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[bd6e8b88-c4a8-4bb2-839a-b650f791233b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 NetworkManager[49055]: <info>  [1770026607.3326] device (tape0f239e2-80): carrier: link connected
Feb 02 10:03:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.336 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[d85d9bb5-2604-4b4c-b602-e631d7c08467]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.353 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5f713209-73f4-44d0-a144-b2a8470c9fa7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape0f239e2-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:a9:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376964, 'reachable_time': 40010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229873, 'error': None, 'target': 'ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.366 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[24e6aa69-0070-4a15-abc4-fd9fb13d2882]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed3:a94f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376964, 'tstamp': 376964}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229874, 'error': None, 'target': 'ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.380 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[b8045f45-ae54-4d7c-b010-150208917eea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape0f239e2-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:a9:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376964, 'reachable_time': 40010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229875, 'error': None, 'target': 'ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.402 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5ebafd7e-b4b4-4c8f-8e0f-ce8daf74dd47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.445 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc034ef-8b71-4a5e-966d-80893eb45fbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.447 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape0f239e2-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.447 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.448 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape0f239e2-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:03:27 compute-1 nova_compute[226294]: 2026-02-02 10:03:27.450 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:27 compute-1 NetworkManager[49055]: <info>  [1770026607.4510] manager: (tape0f239e2-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Feb 02 10:03:27 compute-1 kernel: tape0f239e2-80: entered promiscuous mode
Feb 02 10:03:27 compute-1 nova_compute[226294]: 2026-02-02 10:03:27.452 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.454 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape0f239e2-80, col_values=(('external_ids', {'iface-id': 'a2dfb49c-9120-4cbe-a32f-76266c8258fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:03:27 compute-1 ovn_controller[133666]: 2026-02-02T10:03:27Z|00031|binding|INFO|Releasing lport a2dfb49c-9120-4cbe-a32f-76266c8258fd from this chassis (sb_readonly=0)
Feb 02 10:03:27 compute-1 nova_compute[226294]: 2026-02-02 10:03:27.455 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:27 compute-1 nova_compute[226294]: 2026-02-02 10:03:27.459 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.460 143542 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e0f239e2-848d-4af8-8655-52de33d6c78c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e0f239e2-848d-4af8-8655-52de33d6c78c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.461 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[eabcafc5-d68b-478b-a721-38e4aac48181]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.462 143542 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: global
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     log         /dev/log local0 debug
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     log-tag     haproxy-metadata-proxy-e0f239e2-848d-4af8-8655-52de33d6c78c
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     user        root
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     group       root
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     maxconn     1024
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     pidfile     /var/lib/neutron/external/pids/e0f239e2-848d-4af8-8655-52de33d6c78c.pid.haproxy
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     daemon
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: defaults
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     log global
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     mode http
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     option httplog
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     option dontlognull
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     option http-server-close
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     option forwardfor
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     retries                 3
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     timeout http-request    30s
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     timeout connect         30s
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     timeout client          32s
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     timeout server          32s
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     timeout http-keep-alive 30s
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: listen listener
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     bind 169.254.169.254:80
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     server metadata /var/lib/neutron/metadata_proxy
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:     http-request add-header X-OVN-Network-ID e0f239e2-848d-4af8-8655-52de33d6c78c
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 02 10:03:27 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:27.463 143542 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c', 'env', 'PROCESS_TAG=haproxy-e0f239e2-848d-4af8-8655-52de33d6c78c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e0f239e2-848d-4af8-8655-52de33d6c78c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 02 10:03:27 compute-1 ceph-mon[80115]: pgmap v721: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Feb 02 10:03:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:27 compute-1 podman[229908]: 2026-02-02 10:03:27.84577758 +0000 UTC m=+0.115184963 container create 32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:03:27 compute-1 podman[229908]: 2026-02-02 10:03:27.761590067 +0000 UTC m=+0.030997500 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb 02 10:03:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:27 compute-1 systemd[1]: Started libpod-conmon-32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23.scope.
Feb 02 10:03:27 compute-1 systemd[1]: Started libcrun container.
Feb 02 10:03:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c2211d282dfacd7338273edfd4401280099477e08d1784ec40396df24bf6f8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 02 10:03:27 compute-1 podman[229908]: 2026-02-02 10:03:27.92379819 +0000 UTC m=+0.193205573 container init 32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 02 10:03:27 compute-1 podman[229908]: 2026-02-02 10:03:27.930552078 +0000 UTC m=+0.199959431 container start 32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 02 10:03:27 compute-1 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [NOTICE]   (229927) : New worker (229929) forked
Feb 02 10:03:27 compute-1 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [NOTICE]   (229927) : Loading success.
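The sequence from "Provisioning metadata for network e0f239e2-..." to the "Loading success." notice above covers the whole OVN metadata path: a veth pair (tape0f239e2-80 / tape0f239e2-81) is created with one end inside the ovnmeta-e0f239e2-... namespace, the external end is added to br-int with external_ids:iface-id=a2dfb49c-9120-4cbe-a32f-76266c8258fd, and a haproxy instance is launched inside the namespace from the generated config, which binds 169.254.169.254:80 and forwards to the /var/lib/neutron/metadata_proxy socket while adding an X-OVN-Network-ID header. Below is a hedged sketch of a probe that exercises that listener from the compute host; it assumes root access and that the namespace name matches the one in the log, and it is not part of any shipped tooling.

    # probe_metadata_proxy.py -- illustrative sketch, assumes root and the namespace name seen in the log
    import subprocess

    NAMESPACE = "ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c"  # from the provisioning lines above

    def probe(namespace=NAMESPACE):
        """Send a bare HTTP request to the haproxy listener on 169.254.169.254:80 inside the namespace.

        Any HTTP status code (even an error) shows haproxy is listening on the
        metadata address; a connection failure suggests the namespace, veth, or
        haproxy setup did not complete.
        """
        cmd = ["ip", "netns", "exec", namespace,
               "curl", "-s", "-o", "/dev/null", "-w", "%{http_code}",
               "http://169.254.169.254/"]
        result = subprocess.run(cmd, capture_output=True, text=True, timeout=10)
        return result.stdout.strip()

    if __name__ == "__main__":
        print("HTTP status from metadata proxy:", probe())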
Feb 02 10:03:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:28.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:28.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:29 compute-1 podman[229939]: 2026-02-02 10:03:29.391215104 +0000 UTC m=+0.066991910 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:03:29 compute-1 nova_compute[226294]: 2026-02-02 10:03:29.547 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:29 compute-1 ceph-mon[80115]: pgmap v722: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Feb 02 10:03:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:30 compute-1 rsyslogd[1009]: imjournal: 4611 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 02 10:03:30 compute-1 ceph-mon[80115]: pgmap v723: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Feb 02 10:03:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:30.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:30.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:31 compute-1 nova_compute[226294]: 2026-02-02 10:03:31.322 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:31 compute-1 nova_compute[226294]: 2026-02-02 10:03:31.582 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <info>  [1770026611.5835] manager: (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <info>  [1770026611.5846] device (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <warn>  [1770026611.5848] device (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <info>  [1770026611.5866] manager: (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <info>  [1770026611.5874] device (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <warn>  [1770026611.5875] device (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <info>  [1770026611.5891] manager: (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <info>  [1770026611.5904] manager: (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <info>  [1770026611.5913] device (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 02 10:03:31 compute-1 NetworkManager[49055]: <info>  [1770026611.5920] device (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 02 10:03:31 compute-1 ovn_controller[133666]: 2026-02-02T10:03:31Z|00032|binding|INFO|Releasing lport a2dfb49c-9120-4cbe-a32f-76266c8258fd from this chassis (sb_readonly=0)
Feb 02 10:03:31 compute-1 nova_compute[226294]: 2026-02-02 10:03:31.602 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:31 compute-1 nova_compute[226294]: 2026-02-02 10:03:31.615 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:03:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:32.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:03:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:32.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:03:33 compute-1 ceph-mon[80115]: pgmap v724: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Feb 02 10:03:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:34 compute-1 nova_compute[226294]: 2026-02-02 10:03:34.590 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:34.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:34.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:35 compute-1 ceph-mon[80115]: pgmap v725: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 74 op/s
Feb 02 10:03:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:36 compute-1 nova_compute[226294]: 2026-02-02 10:03:36.323 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:36 compute-1 ovn_controller[133666]: 2026-02-02T10:03:36Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:4d:0e 10.100.0.25
Feb 02 10:03:36 compute-1 ovn_controller[133666]: 2026-02-02T10:03:36Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:4d:0e 10.100.0.25
Feb 02 10:03:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:36.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:36.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100336 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:03:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:37 compute-1 ceph-mon[80115]: pgmap v726: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1023 B/s wr, 69 op/s
Feb 02 10:03:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:38 compute-1 sudo[229966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:03:38 compute-1 sudo[229966]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:03:38 compute-1 sudo[229966]: pam_unix(sudo:session): session closed for user root
Feb 02 10:03:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:38.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:38.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:38 compute-1 ceph-mon[80115]: pgmap v727: 353 pgs: 353 active+clean; 200 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 134 op/s
Feb 02 10:03:39 compute-1 nova_compute[226294]: 2026-02-02 10:03:39.628 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:40.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:40.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:41 compute-1 nova_compute[226294]: 2026-02-02 10:03:41.325 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:41 compute-1 ceph-mon[80115]: pgmap v728: 353 pgs: 353 active+clean; 200 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Feb 02 10:03:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:03:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:42.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:03:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:42.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.612 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:43 compute-1 ceph-mon[80115]: pgmap v729: 353 pgs: 353 active+clean; 200 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.613 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.613 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.613 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.614 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.615 226298 INFO nova.compute.manager [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Terminating instance
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.617 226298 DEBUG nova.compute.manager [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 02 10:03:43 compute-1 kernel: tap6cd44411-fc (unregistering): left promiscuous mode
Feb 02 10:03:43 compute-1 NetworkManager[49055]: <info>  [1770026623.6714] device (tap6cd44411-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 02 10:03:43 compute-1 ovn_controller[133666]: 2026-02-02T10:03:43Z|00033|binding|INFO|Releasing lport 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 from this chassis (sb_readonly=0)
Feb 02 10:03:43 compute-1 ovn_controller[133666]: 2026-02-02T10:03:43Z|00034|binding|INFO|Setting lport 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 down in Southbound
Feb 02 10:03:43 compute-1 ovn_controller[133666]: 2026-02-02T10:03:43Z|00035|binding|INFO|Removing iface tap6cd44411-fc ovn-installed in OVS
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.677 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.684 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:4d:0e 10.100.0.25'], port_security=['fa:16:3e:7b:4d:0e 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '7440c4af-7e45-4796-ac03-ddd1eb035702', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e0f239e2-848d-4af8-8655-52de33d6c78c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ef61564f-168a-4d39-98a6-1b124de25a8e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=690cea54-b458-4b47-8c01-a695a89a554b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.685 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 in datapath e0f239e2-848d-4af8-8655-52de33d6c78c unbound from our chassis
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.687 143542 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e0f239e2-848d-4af8-8655-52de33d6c78c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.687 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[0295001c-fabd-4952-abaf-59b240f5050f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.688 143542 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c namespace which is not needed anymore
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.695 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84980016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:43 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Feb 02 10:03:43 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 12.830s CPU time.
Feb 02 10:03:43 compute-1 systemd-machined[195072]: Machine qemu-1-instance-00000002 terminated.
Feb 02 10:03:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100343 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:03:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:43 compute-1 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [NOTICE]   (229927) : haproxy version is 2.8.14-c23fe91
Feb 02 10:03:43 compute-1 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [NOTICE]   (229927) : path to executable is /usr/sbin/haproxy
Feb 02 10:03:43 compute-1 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [WARNING]  (229927) : Exiting Master process...
Feb 02 10:03:43 compute-1 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [WARNING]  (229927) : Exiting Master process...
Feb 02 10:03:43 compute-1 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [ALERT]    (229927) : Current worker (229929) exited with code 143 (Terminated)
Feb 02 10:03:43 compute-1 neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c[229923]: [WARNING]  (229927) : All workers exited. Exiting... (0)
Feb 02 10:03:43 compute-1 systemd[1]: libpod-32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23.scope: Deactivated successfully.
Feb 02 10:03:43 compute-1 podman[230022]: 2026-02-02 10:03:43.82983318 +0000 UTC m=+0.046244922 container died 32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.847 226298 INFO nova.virt.libvirt.driver [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Instance destroyed successfully.
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.848 226298 DEBUG nova.objects.instance [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'resources' on Instance uuid 7440c4af-7e45-4796-ac03-ddd1eb035702 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:03:43 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23-userdata-shm.mount: Deactivated successfully.
Feb 02 10:03:43 compute-1 systemd[1]: var-lib-containers-storage-overlay-0c2211d282dfacd7338273edfd4401280099477e08d1784ec40396df24bf6f8d-merged.mount: Deactivated successfully.
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.862 226298 DEBUG nova.virt.libvirt.vif [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:03:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2085006119',display_name='tempest-TestNetworkBasicOps-server-2085006119',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2085006119',id=2,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPTNHym9kIQXfz4suqCv0Mc6jF6e6B0Wqnxt3FW6kpn3GvVkI5IjdsdGcg/l/4S1jaKaNrV8XIFTm81yb+PAyRJlVvfR+xXnnQd/vZvQFfCYnwki1MFmz37JvJeahfBCnw==',key_name='tempest-TestNetworkBasicOps-1317968913',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:03:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-qht7ug2q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:03:23Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=7440c4af-7e45-4796-ac03-ddd1eb035702,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.862 226298 DEBUG nova.network.os_vif_util [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "address": "fa:16:3e:7b:4d:0e", "network": {"id": "e0f239e2-848d-4af8-8655-52de33d6c78c", "bridge": "br-int", "label": "tempest-network-smoke--1699631923", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6cd44411-fc", "ovs_interfaceid": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.863 226298 DEBUG nova.network.os_vif_util [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.863 226298 DEBUG os_vif [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.865 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.865 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6cd44411-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:03:43 compute-1 podman[230022]: 2026-02-02 10:03:43.866342904 +0000 UTC m=+0.082754666 container cleanup 32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.867 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.868 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.871 226298 INFO os_vif [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:4d:0e,bridge_name='br-int',has_traffic_filtering=True,id=6cd44411-fc3c-47f5-9a7e-daa3acfc46b5,network=Network(e0f239e2-848d-4af8-8655-52de33d6c78c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6cd44411-fc')
Feb 02 10:03:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:43 compute-1 systemd[1]: libpod-conmon-32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23.scope: Deactivated successfully.
Feb 02 10:03:43 compute-1 podman[230061]: 2026-02-02 10:03:43.9263933 +0000 UTC m=+0.038902929 container remove 32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.929 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[36af9218-dd7a-4acf-93f5-059240575fd2]: (4, ('Mon Feb  2 10:03:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c (32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23)\n32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23\nMon Feb  2 10:03:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c (32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23)\n32c8233d46a3aba28edf7adfdd5cad3f04854640a635cc39b7a9013ccfe3cc23\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.931 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[bf77539e-a83a-42c6-a3a1-6ba252f8a27b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.932 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape0f239e2-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.934 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:43 compute-1 kernel: tape0f239e2-80: left promiscuous mode
Feb 02 10:03:43 compute-1 nova_compute[226294]: 2026-02-02 10:03:43.940 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.942 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[70994b38-efc0-4946-8411-5188e411bfd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.960 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[54181e19-7a0b-42a5-a1c7-76771643ca0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.961 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[229e5397-f2d3-4e0e-99b7-6edcc397e206]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.972 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[d559afe4-4b75-4ea4-b331-8fa8f4366a1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376957, 'reachable_time': 43124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230097, 'error': None, 'target': 'ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:43 compute-1 systemd[1]: run-netns-ovnmeta\x2de0f239e2\x2d848d\x2d4af8\x2d8655\x2d52de33d6c78c.mount: Deactivated successfully.
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.980 143813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e0f239e2-848d-4af8-8655-52de33d6c78c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 02 10:03:43 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:43.980 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[fdad92a5-3c20-47f0-b9cb-714e1f4f9d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.316 226298 INFO nova.virt.libvirt.driver [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Deleting instance files /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702_del
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.317 226298 INFO nova.virt.libvirt.driver [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Deletion of /var/lib/nova/instances/7440c4af-7e45-4796-ac03-ddd1eb035702_del complete
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.381 226298 DEBUG nova.virt.libvirt.host [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.382 226298 INFO nova.virt.libvirt.host [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] UEFI support detected
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.383 226298 INFO nova.compute.manager [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Took 0.77 seconds to destroy the instance on the hypervisor.
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.384 226298 DEBUG oslo.service.loopingcall [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.384 226298 DEBUG nova.compute.manager [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.384 226298 DEBUG nova.network.neutron [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.629 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:44.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.777 226298 DEBUG nova.compute.manager [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-unplugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.778 226298 DEBUG oslo_concurrency.lockutils [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.778 226298 DEBUG oslo_concurrency.lockutils [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.778 226298 DEBUG oslo_concurrency.lockutils [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.779 226298 DEBUG nova.compute.manager [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] No waiting events found dispatching network-vif-unplugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:03:44 compute-1 nova_compute[226294]: 2026-02-02 10:03:44.779 226298 DEBUG nova.compute.manager [req-927f2239-a691-4005-8cad-2901a3c4575e req-bbca04fe-62a1-43b3-b8b1-c73577697f1a b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-unplugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 02 10:03:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:44.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:44.904 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:44.904 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:03:44.904 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:45 compute-1 ceph-mon[80115]: pgmap v730: 353 pgs: 353 active+clean; 200 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:03:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84980016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.577 226298 DEBUG nova.network.neutron [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.602 226298 INFO nova.compute.manager [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Took 2.22 seconds to deallocate network for instance.
Feb 02 10:03:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.666 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.667 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:46.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.747 226298 DEBUG oslo_concurrency.processutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.862 226298 DEBUG nova.compute.manager [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.863 226298 DEBUG oslo_concurrency.lockutils [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.863 226298 DEBUG oslo_concurrency.lockutils [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.864 226298 DEBUG oslo_concurrency.lockutils [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.864 226298 DEBUG nova.compute.manager [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] No waiting events found dispatching network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.865 226298 WARNING nova.compute.manager [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received unexpected event network-vif-plugged-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 for instance with vm_state deleted and task_state None.
Feb 02 10:03:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:46 compute-1 nova_compute[226294]: 2026-02-02 10:03:46.865 226298 DEBUG nova.compute.manager [req-2d8ae641-6d94-494c-b0e6-9b2a04fcba6e req-4d1a677b-5c4e-4e99-8066-7993dc9a7c38 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Received event network-vif-deleted-6cd44411-fc3c-47f5-9a7e-daa3acfc46b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:03:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:46.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
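The burst of events above (network-vif-unplugged at 10:03:44, then network-vif-plugged and network-vif-deleted for the same port while the instance is already gone) is Neutron notifying Nova through the os-server-external-events API; the WARNING is Nova discarding an event for an instance whose vm_state is already deleted. A rough sketch of how such an event is delivered, with the endpoint and token as assumptions (in practice Neutron's nova notifier sends this, not an operator):

    import requests

    NOVA = "http://nova-api.example:8774/v2.1"      # hypothetical endpoint
    HEADERS = {"X-Auth-Token": "TOKEN"}             # placeholder credentials

    event = {"events": [{
        "name": "network-vif-plugged",
        "server_uuid": "7440c4af-7e45-4796-ac03-ddd1eb035702",   # instance from the log
        "tag": "6cd44411-fc3c-47f5-9a7e-daa3acfc46b5",            # port ID from the log
        "status": "completed",
    }]}
    r = requests.post(f"{NOVA}/os-server-external-events", json=event, headers=HEADERS)
    print(r.status_code)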
Feb 02 10:03:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:03:47 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3299572418' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.229 226298 DEBUG oslo_concurrency.processutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
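The two processutils lines show the resource tracker shelling out to the exact command "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" to size the RBD-backed DISK_GB inventory (it returned 0 in 0.482 s here). A minimal reproduction of that call, assuming the same ceph.conf and client.openstack keyring are available where it runs:

    import json
    import subprocess

    # Same command line the journal shows nova-compute running via oslo processutils.
    cmd = ["ceph", "df", "--format=json", "--id", "openstack",
           "--conf", "/etc/ceph/ceph.conf"]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    df = json.loads(out.stdout)
    print(sorted(df))   # top-level sections of the report; exact layout is not shown in the log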
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.236 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.280 226298 ERROR nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [req-c8004082-069f-4a06-823b-f50ed91fc676] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 8e32c057-ad28-4c19-8374-763e0c1c8622.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-c8004082-069f-4a06-823b-f50ed91fc676"}]}
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.298 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.325 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.326 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 10:03:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.351 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.381 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.430 226298 DEBUG oslo_concurrency.processutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:03:47 compute-1 ceph-mon[80115]: pgmap v731: 353 pgs: 353 active+clean; 200 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Feb 02 10:03:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:03:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3299572418' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:03:47 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1182949554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.866 226298 DEBUG oslo_concurrency.processutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.872 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 10:03:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.925 226298 DEBUG nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updated inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.925 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.926 226298 DEBUG nova.compute.provider_tree [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 10:03:47 compute-1 nova_compute[226294]: 2026-02-02 10:03:47.965 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
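The stretch from the 409 at 10:03:47.280 to the successful update at 10:03:47.925 is Placement's optimistic concurrency control at work: the PUT carried a stale resource-provider generation, Placement answered 409 placement.concurrent_update, the client re-read inventories, aggregates and traits, retried, and the generation moved from 3 to 4. A rough sketch of that retry loop against the Placement API; the URL, token and retry policy here are assumptions for illustration, not nova's actual client code:

    import requests

    BASE = "http://placement.example:8778"      # hypothetical endpoint
    HEADERS = {"X-Auth-Token": "TOKEN",         # placeholder credentials
               "OpenStack-API-Version": "placement 1.26"}
    RP = "8e32c057-ad28-4c19-8374-763e0c1c8622"

    def put_inventories(inventories):
        for _ in range(3):
            # Re-read the provider's current generation before each attempt.
            cur = requests.get(f"{BASE}/resource_providers/{RP}/inventories",
                               headers=HEADERS).json()
            body = {"resource_provider_generation": cur["resource_provider_generation"],
                    "inventories": inventories}
            r = requests.put(f"{BASE}/resource_providers/{RP}/inventories",
                             headers=HEADERS, json=body)
            if r.status_code != 409:    # 409 == placement.concurrent_update, retry
                return r
        raise RuntimeError("generation conflict persisted after retries")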
Feb 02 10:03:48 compute-1 nova_compute[226294]: 2026-02-02 10:03:48.003 226298 INFO nova.scheduler.client.report [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Deleted allocations for instance 7440c4af-7e45-4796-ac03-ddd1eb035702
Feb 02 10:03:48 compute-1 nova_compute[226294]: 2026-02-02 10:03:48.164 226298 DEBUG oslo_concurrency.lockutils [None req-eca082bb-25ef-4c4d-8e2c-ee5a4151b8f6 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "7440c4af-7e45-4796-ac03-ddd1eb035702" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:03:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1182949554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:48 compute-1 ceph-mon[80115]: pgmap v732: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 346 KiB/s rd, 2.1 MiB/s wr, 95 op/s
Feb 02 10:03:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:48.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:48 compute-1 nova_compute[226294]: 2026-02-02 10:03:48.867 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:48.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 10:03:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:03:49 compute-1 nova_compute[226294]: 2026-02-02 10:03:49.631 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8002010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:03:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:03:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:50.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:50.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:50 compute-1 nova_compute[226294]: 2026-02-02 10:03:50.968 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:50 compute-1 nova_compute[226294]: 2026-02-02 10:03:50.981 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:51 compute-1 ceph-mon[80115]: pgmap v733: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 16 KiB/s wr, 29 op/s
Feb 02 10:03:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:52.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:52.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 10:03:53 compute-1 podman[230151]: 2026-02-02 10:03:53.456955558 +0000 UTC m=+0.126004008 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
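The health_status=healthy line above comes from podman running the configured check ('test': '/openstack/healthcheck', bind-mounted from /var/lib/openstack/healthchecks/ovn_controller) on its timer. The same check can be triggered by hand; a small wrapper assuming the container name taken from the journal line:

    import subprocess

    # Trigger the container's configured healthcheck once; exit code 0 means healthy.
    rc = subprocess.run(["podman", "healthcheck", "run", "ovn_controller"]).returncode
    print("healthy" if rc == 0 else f"unhealthy (rc={rc})")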
Feb 02 10:03:53 compute-1 ceph-mon[80115]: pgmap v734: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 16 KiB/s wr, 29 op/s
Feb 02 10:03:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498002720 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:53 compute-1 nova_compute[226294]: 2026-02-02 10:03:53.869 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:53 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:03:53 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3528945408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:54 compute-1 nova_compute[226294]: 2026-02-02 10:03:54.633 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:54 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3528945408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:03:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:54.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:54.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:55 compute-1 ceph-mon[80115]: pgmap v735: 353 pgs: 353 active+clean; 41 MiB data, 244 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 18 KiB/s wr, 60 op/s
Feb 02 10:03:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/1229698750' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:03:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/1229698750' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:03:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 10:03:56 compute-1 ceph-mon[80115]: pgmap v736: 353 pgs: 353 active+clean; 41 MiB data, 244 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 4.3 KiB/s wr, 59 op/s
Feb 02 10:03:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:56.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:56.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:03:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:58 compute-1 sudo[230181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:03:58 compute-1 sudo[230181]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:03:58 compute-1 sudo[230181]: pam_unix(sudo:session): session closed for user root
Feb 02 10:03:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:03:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:03:58.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:03:58 compute-1 nova_compute[226294]: 2026-02-02 10:03:58.845 226298 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1770026623.8443913, 7440c4af-7e45-4796-ac03-ddd1eb035702 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:03:58 compute-1 nova_compute[226294]: 2026-02-02 10:03:58.846 226298 INFO nova.compute.manager [-] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] VM Stopped (Lifecycle Event)
Feb 02 10:03:58 compute-1 nova_compute[226294]: 2026-02-02 10:03:58.864 226298 DEBUG nova.compute.manager [None req-b3a0dc63-054f-480f-8518-83b7344a459f - - - - - -] [instance: 7440c4af-7e45-4796-ac03-ddd1eb035702] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:03:58 compute-1 nova_compute[226294]: 2026-02-02 10:03:58.870 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:03:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:03:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:03:58.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:03:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100358 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:03:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 10:03:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:03:59 compute-1 ceph-mon[80115]: pgmap v737: 353 pgs: 353 active+clean; 41 MiB data, 238 MiB used, 60 GiB / 60 GiB avail; 41 KiB/s rd, 4.8 KiB/s wr, 61 op/s
Feb 02 10:03:59 compute-1 nova_compute[226294]: 2026-02-02 10:03:59.682 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:03:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:03:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:03:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:00 compute-1 podman[230207]: 2026-02-02 10:04:00.394654726 +0000 UTC m=+0.066652131 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:04:00 compute-1 sudo[230222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:04:00 compute-1 sudo[230222]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:04:00 compute-1 sudo[230222]: pam_unix(sudo:session): session closed for user root
Feb 02 10:04:00 compute-1 sudo[230251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Feb 02 10:04:00 compute-1 sudo[230251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:04:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:00.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:00.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:01 compute-1 podman[230348]: 2026-02-02 10:04:01.07693985 +0000 UTC m=+0.068467319 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 10:04:01 compute-1 podman[230348]: 2026-02-02 10:04:01.155627707 +0000 UTC m=+0.147155186 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Feb 02 10:04:01 compute-1 ceph-mon[80115]: pgmap v738: 353 pgs: 353 active+clean; 41 MiB data, 238 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 2.2 KiB/s wr, 31 op/s
Feb 02 10:04:01 compute-1 podman[230481]: 2026-02-02 10:04:01.716804844 +0000 UTC m=+0.051834600 container exec 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 10:04:01 compute-1 podman[230481]: 2026-02-02 10:04:01.725448302 +0000 UTC m=+0.060478038 container exec_died 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 10:04:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:01 compute-1 podman[230554]: 2026-02-02 10:04:01.972095234 +0000 UTC m=+0.060550789 container exec 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Feb 02 10:04:01 compute-1 podman[230554]: 2026-02-02 10:04:01.982367706 +0000 UTC m=+0.070823261 container exec_died 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Feb 02 10:04:02 compute-1 podman[230620]: 2026-02-02 10:04:02.216362394 +0000 UTC m=+0.063077017 container exec 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 10:04:02 compute-1 podman[230620]: 2026-02-02 10:04:02.25560625 +0000 UTC m=+0.102320843 container exec_died 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.283451) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642283506, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1114, "num_deletes": 250, "total_data_size": 2480949, "memory_usage": 2518520, "flush_reason": "Manual Compaction"}
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642297209, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1022451, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23214, "largest_seqno": 24323, "table_properties": {"data_size": 1018432, "index_size": 1607, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10580, "raw_average_key_size": 20, "raw_value_size": 1009718, "raw_average_value_size": 1949, "num_data_blocks": 71, "num_entries": 518, "num_filter_entries": 518, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026553, "oldest_key_time": 1770026553, "file_creation_time": 1770026642, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 13823 microseconds, and 4723 cpu microseconds.
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.297275) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1022451 bytes OK
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.297298) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.298842) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.298865) EVENT_LOG_v1 {"time_micros": 1770026642298858, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.298887) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 2475493, prev total WAL file size 2475493, number of live WAL files 2.
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.299722) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353030' seq:72057594037927935, type:22 .. '6D67727374617400373531' seq:0, type:0; will stop at (end)
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(998KB)], [42(14MB)]
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642299766, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 15733684, "oldest_snapshot_seqno": -1}
Feb 02 10:04:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 10:04:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5503 keys, 12269977 bytes, temperature: kUnknown
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642462914, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 12269977, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12234320, "index_size": 20827, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 139121, "raw_average_key_size": 25, "raw_value_size": 12135889, "raw_average_value_size": 2205, "num_data_blocks": 850, "num_entries": 5503, "num_filter_entries": 5503, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026642, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.463176) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 12269977 bytes
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.465941) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 96.4 rd, 75.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 14.0 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(27.4) write-amplify(12.0) OK, records in: 5983, records dropped: 480 output_compression: NoCompression
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.465961) EVENT_LOG_v1 {"time_micros": 1770026642465952, "job": 24, "event": "compaction_finished", "compaction_time_micros": 163215, "compaction_time_cpu_micros": 33954, "output_level": 6, "num_output_files": 1, "total_output_size": 12269977, "num_input_records": 5983, "num_output_records": 5503, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642466169, "job": 24, "event": "table_file_deletion", "file_number": 44}
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026642467840, "job": 24, "event": "table_file_deletion", "file_number": 42}
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.299604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467975) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:02.467980) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:02 compute-1 podman[230686]: 2026-02-02 10:04:02.485174911 +0000 UTC m=+0.061228607 container exec 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, name=keepalived, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, vcs-type=git, description=keepalived for Ceph, release=1793, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph.)
Feb 02 10:04:02 compute-1 podman[230686]: 2026-02-02 10:04:02.52752578 +0000 UTC m=+0.103579446 container exec_died 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, release=1793, name=keepalived, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, vcs-type=git, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph)
Feb 02 10:04:02 compute-1 sudo[230251]: pam_unix(sudo:session): session closed for user root
Feb 02 10:04:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:04:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:04:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:04:02 compute-1 sudo[230717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:04:02 compute-1 sudo[230717]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:04:02 compute-1 sudo[230717]: pam_unix(sudo:session): session closed for user root
Feb 02 10:04:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:04:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:02.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:04:02 compute-1 sudo[230742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:04:02 compute-1 sudo[230742]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:04:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:02.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:03 compute-1 sudo[230742]: pam_unix(sudo:session): session closed for user root
Feb 02 10:04:03 compute-1 ceph-mon[80115]: pgmap v739: 353 pgs: 353 active+clean; 41 MiB data, 238 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 2.2 KiB/s wr, 31 op/s
Feb 02 10:04:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:04:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:04:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb 02 10:04:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:03 compute-1 nova_compute[226294]: 2026-02-02 10:04:03.871 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:04 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb 02 10:04:04 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:04:04 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:04:04 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:04:04 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:04:04 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:04:04 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:04:04 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:04:04 compute-1 nova_compute[226294]: 2026-02-02 10:04:04.684 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:04.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:04 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:04:04 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3260463622' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:04:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:04.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:04:05 compute-1 ceph-mon[80115]: pgmap v740: 353 pgs: 353 active+clean; 41 MiB data, 238 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 2.5 KiB/s wr, 33 op/s
Feb 02 10:04:05 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3260463622' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100405 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:04:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4079846023' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:06.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:04:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:06.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.287 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.301 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.302 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.302 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.312 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.312 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.312 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.312 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.313 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.313 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:04:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:07 compute-1 nova_compute[226294]: 2026-02-02 10:04:07.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:07 compute-1 ceph-mon[80115]: pgmap v741: 353 pgs: 353 active+clean; 41 MiB data, 238 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Feb 02 10:04:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80095a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:08 compute-1 nova_compute[226294]: 2026-02-02 10:04:08.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:08 compute-1 nova_compute[226294]: 2026-02-02 10:04:08.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:04:08 compute-1 nova_compute[226294]: 2026-02-02 10:04:08.679 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:04:08 compute-1 nova_compute[226294]: 2026-02-02 10:04:08.680 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:04:08 compute-1 nova_compute[226294]: 2026-02-02 10:04:08.680 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:04:08 compute-1 nova_compute[226294]: 2026-02-02 10:04:08.681 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:04:08 compute-1 nova_compute[226294]: 2026-02-02 10:04:08.681 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:04:08 compute-1 ceph-mon[80115]: pgmap v742: 353 pgs: 353 active+clean; 41 MiB data, 238 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Feb 02 10:04:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:08.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:08 compute-1 nova_compute[226294]: 2026-02-02 10:04:08.873 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:08.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:04:09 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2465839043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:09 compute-1 sudo[230823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:04:09 compute-1 sudo[230823]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:04:09 compute-1 sudo[230823]: pam_unix(sudo:session): session closed for user root
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.137 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.330 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.331 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4928MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.332 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.332 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.406 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.406 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.420 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.725 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:04:09 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3035330726' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.899 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.905 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:04:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.922 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.946 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:04:09 compute-1 nova_compute[226294]: 2026-02-02 10:04:09.947 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:04:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:04:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:04:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2465839043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2115916494' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3035330726' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:10.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:10.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2498217816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:11 compute-1 ceph-mon[80115]: pgmap v743: 353 pgs: 353 active+clean; 41 MiB data, 238 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Feb 02 10:04:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3023024456' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:04:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:12.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:12.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:13 compute-1 ceph-mon[80115]: pgmap v744: 353 pgs: 353 active+clean; 41 MiB data, 238 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Feb 02 10:04:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:13 compute-1 nova_compute[226294]: 2026-02-02 10:04:13.877 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:14 compute-1 nova_compute[226294]: 2026-02-02 10:04:14.774 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:14.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:14.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:15 compute-1 nova_compute[226294]: 2026-02-02 10:04:15.454 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:15 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:04:15.455 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:04:15 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:04:15.457 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:04:15 compute-1 ceph-mon[80115]: pgmap v745: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Feb 02 10:04:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:16.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:16.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:17 compute-1 ceph-mon[80115]: pgmap v746: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:04:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:04:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:18 compute-1 sudo[230879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:04:18 compute-1 sudo[230879]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:04:18 compute-1 sudo[230879]: pam_unix(sudo:session): session closed for user root
Feb 02 10:04:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:18.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:18 compute-1 nova_compute[226294]: 2026-02-02 10:04:18.878 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:18.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:19 compute-1 ceph-mon[80115]: pgmap v747: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:04:19 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1713753610' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:04:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:19 compute-1 nova_compute[226294]: 2026-02-02 10:04:19.775 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:20 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:04:20.459 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:04:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3934267532' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:04:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:20.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:20.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:21 compute-1 ceph-mon[80115]: pgmap v748: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:04:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:22 compute-1 ceph-mon[80115]: pgmap v749: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:04:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:22.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:22.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100423 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:04:23 compute-1 nova_compute[226294]: 2026-02-02 10:04:23.881 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:24 compute-1 podman[230907]: 2026-02-02 10:04:24.445536063 +0000 UTC m=+0.115014967 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 02 10:04:24 compute-1 nova_compute[226294]: 2026-02-02 10:04:24.777 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:24.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:24.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:25 compute-1 ceph-mon[80115]: pgmap v750: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Feb 02 10:04:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:26.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:26.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:27 compute-1 ceph-mon[80115]: pgmap v751: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:04:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:28 compute-1 ceph-mon[80115]: pgmap v752: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:04:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:28.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:28 compute-1 nova_compute[226294]: 2026-02-02 10:04:28.882 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:28.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:29 compute-1 nova_compute[226294]: 2026-02-02 10:04:29.823 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:30.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:30.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:31 compute-1 podman[230937]: 2026-02-02 10:04:31.394101128 +0000 UTC m=+0.070241805 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:04:31 compute-1 ceph-mon[80115]: pgmap v753: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:04:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:04:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:32.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:32.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:33 compute-1 ovn_controller[133666]: 2026-02-02T10:04:33Z|00036|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Feb 02 10:04:33 compute-1 ceph-mon[80115]: pgmap v754: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:04:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 10:04:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:33 compute-1 nova_compute[226294]: 2026-02-02 10:04:33.884 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:34 compute-1 ceph-mon[80115]: pgmap v755: 353 pgs: 353 active+clean; 109 MiB data, 283 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 115 op/s
Feb 02 10:04:34 compute-1 nova_compute[226294]: 2026-02-02 10:04:34.826 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:34.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:34.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 10:04:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:04:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:36.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:36.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:37 compute-1 ceph-mon[80115]: pgmap v756: 353 pgs: 353 active+clean; 109 MiB data, 283 MiB used, 60 GiB / 60 GiB avail; 171 KiB/s rd, 2.0 MiB/s wr, 41 op/s
Feb 02 10:04:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:38 compute-1 sudo[230960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:04:38 compute-1 sudo[230960]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:04:38 compute-1 sudo[230960]: pam_unix(sudo:session): session closed for user root
Feb 02 10:04:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:38.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:38 compute-1 nova_compute[226294]: 2026-02-02 10:04:38.886 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:38.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:39 compute-1 ceph-mon[80115]: pgmap v757: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 321 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:04:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 10:04:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:39 compute-1 nova_compute[226294]: 2026-02-02 10:04:39.870 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:04:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:40.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:04:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:04:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:40.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:04:41 compute-1 ceph-mon[80115]: pgmap v758: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 321 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:04:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:42.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:42.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:43 compute-1 ceph-mon[80115]: pgmap v759: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 321 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:04:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:43 compute-1 nova_compute[226294]: 2026-02-02 10:04:43.888 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:44 compute-1 ceph-mon[80115]: pgmap v760: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 321 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Feb 02 10:04:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:44.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:04:44.905 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:04:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:04:44.906 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:04:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:04:44.906 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:04:44 compute-1 nova_compute[226294]: 2026-02-02 10:04:44.907 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:44.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100445 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:04:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8000f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:46.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:04:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:46.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.303089) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687303680, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 760, "num_deletes": 256, "total_data_size": 1528395, "memory_usage": 1554632, "flush_reason": "Manual Compaction"}
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687313442, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1010544, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24328, "largest_seqno": 25083, "table_properties": {"data_size": 1006812, "index_size": 1512, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8235, "raw_average_key_size": 18, "raw_value_size": 999314, "raw_average_value_size": 2266, "num_data_blocks": 65, "num_entries": 441, "num_filter_entries": 441, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026642, "oldest_key_time": 1770026642, "file_creation_time": 1770026687, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 10403 microseconds, and 3354 cpu microseconds.
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.313495) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1010544 bytes OK
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.313523) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.315774) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.315793) EVENT_LOG_v1 {"time_micros": 1770026687315788, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.315813) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 1524319, prev total WAL file size 1524319, number of live WAL files 2.
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.316379) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323530' seq:72057594037927935, type:22 .. '6C6F676D00353032' seq:0, type:0; will stop at (end)
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(986KB)], [45(11MB)]
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687316431, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13280521, "oldest_snapshot_seqno": -1}
Feb 02 10:04:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5415 keys, 13121633 bytes, temperature: kUnknown
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687409292, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13121633, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13085284, "index_size": 21711, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 138447, "raw_average_key_size": 25, "raw_value_size": 12987133, "raw_average_value_size": 2398, "num_data_blocks": 886, "num_entries": 5415, "num_filter_entries": 5415, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026687, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.409646) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13121633 bytes
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.417845) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.8 rd, 141.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 11.7 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(26.1) write-amplify(13.0) OK, records in: 5944, records dropped: 529 output_compression: NoCompression
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.417876) EVENT_LOG_v1 {"time_micros": 1770026687417863, "job": 26, "event": "compaction_finished", "compaction_time_micros": 92986, "compaction_time_cpu_micros": 18182, "output_level": 6, "num_output_files": 1, "total_output_size": 13121633, "num_input_records": 5944, "num_output_records": 5415, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687418179, "job": 26, "event": "table_file_deletion", "file_number": 47}
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026687420041, "job": 26, "event": "table_file_deletion", "file_number": 45}
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.316262) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420082) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420086) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420089) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:47 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:04:47.420092) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:04:47 compute-1 ceph-mon[80115]: pgmap v761: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 150 KiB/s rd, 108 KiB/s wr, 25 op/s
Feb 02 10:04:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:04:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001e20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:48.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:48 compute-1 nova_compute[226294]: 2026-02-02 10:04:48.890 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:48.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:49 compute-1 ceph-mon[80115]: pgmap v762: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 150 KiB/s rd, 111 KiB/s wr, 25 op/s
Feb 02 10:04:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8000f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001e20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:49 compute-1 nova_compute[226294]: 2026-02-02 10:04:49.909 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:50.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:50.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:51 compute-1 ceph-mon[80115]: pgmap v763: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 15 KiB/s wr, 1 op/s
Feb 02 10:04:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001e20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:52 compute-1 ceph-mon[80115]: pgmap v764: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 15 KiB/s wr, 1 op/s
Feb 02 10:04:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:52.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:52.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:53 compute-1 nova_compute[226294]: 2026-02-02 10:04:53.891 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:54.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:54 compute-1 nova_compute[226294]: 2026-02-02 10:04:54.910 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:04:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:54.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:04:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 02 10:04:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3753626506' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:04:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 02 10:04:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3753626506' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:04:55 compute-1 podman[230996]: 2026-02-02 10:04:55.443307178 +0000 UTC m=+0.107711515 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 02 10:04:55 compute-1 ceph-mon[80115]: pgmap v765: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 16 KiB/s wr, 1 op/s
Feb 02 10:04:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3753626506' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:04:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3753626506' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:04:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:56 compute-1 sshd-session[231007]: Invalid user solv from 80.94.92.184 port 50274
Feb 02 10:04:56 compute-1 sshd-session[231007]: Connection closed by invalid user solv 80.94.92.184 port 50274 [preauth]
Feb 02 10:04:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:56.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:56.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:04:57 compute-1 ceph-mon[80115]: pgmap v766: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.3 KiB/s wr, 0 op/s
Feb 02 10:04:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80091b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:58 compute-1 sudo[231026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:04:58 compute-1 sudo[231026]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:04:58 compute-1 sudo[231026]: pam_unix(sudo:session): session closed for user root
Feb 02 10:04:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:04:58.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:58 compute-1 nova_compute[226294]: 2026-02-02 10:04:58.892 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:04:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:04:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:04:58.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:04:59 compute-1 ceph-mon[80115]: pgmap v767: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 6.6 KiB/s rd, 3.3 KiB/s wr, 1 op/s
Feb 02 10:04:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:04:59 compute-1 nova_compute[226294]: 2026-02-02 10:04:59.911 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:04:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:04:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:00 compute-1 ceph-mon[80115]: pgmap v768: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 6.6 KiB/s rd, 1023 B/s wr, 0 op/s
Feb 02 10:05:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:00.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:00.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:05:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:02 compute-1 podman[231052]: 2026-02-02 10:05:02.374665678 +0000 UTC m=+0.053547255 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 02 10:05:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:02.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:02.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:03 compute-1 ceph-mon[80115]: pgmap v769: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 6.6 KiB/s rd, 1023 B/s wr, 0 op/s
Feb 02 10:05:03 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3041023965' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:03 compute-1 nova_compute[226294]: 2026-02-02 10:05:03.893 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:04.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:04 compute-1 nova_compute[226294]: 2026-02-02 10:05:04.913 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:04.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:05 compute-1 ceph-mon[80115]: pgmap v770: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 6.2 KiB/s wr, 29 op/s
Feb 02 10:05:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1277977808' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:06 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/462922132' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:05:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:06.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:05:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:05:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:06.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:05:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:07 compute-1 ceph-mon[80115]: pgmap v771: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Feb 02 10:05:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:07 compute-1 nova_compute[226294]: 2026-02-02 10:05:07.946 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:05:07 compute-1 nova_compute[226294]: 2026-02-02 10:05:07.947 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:05:07 compute-1 nova_compute[226294]: 2026-02-02 10:05:07.947 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:05:07 compute-1 nova_compute[226294]: 2026-02-02 10:05:07.948 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:05:07 compute-1 nova_compute[226294]: 2026-02-02 10:05:07.948 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:05:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.646 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.664 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.665 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.665 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:05:08 compute-1 ceph-mon[80115]: pgmap v772: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.697 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.698 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.698 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.699 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.699 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:05:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:08.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:08 compute-1 nova_compute[226294]: 2026-02-02 10:05:08.895 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:08.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:05:09 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1611746226' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.156 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.315 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.316 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4979MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.316 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.316 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:05:09 compute-1 sudo[231098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:05:09 compute-1 sudo[231098]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:05:09 compute-1 sudo[231098]: pam_unix(sudo:session): session closed for user root
Feb 02 10:05:09 compute-1 sudo[231123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:05:09 compute-1 sudo[231123]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.400 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.400 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.427 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:05:09 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1611746226' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:09 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3983497669' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:05:09 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1662336788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:09 compute-1 sudo[231123]: pam_unix(sudo:session): session closed for user root
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.892 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.897 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.911 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.912 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.912 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:05:09 compute-1 nova_compute[226294]: 2026-02-02 10:05:09.922 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1662336788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1806638358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:05:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:05:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:05:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:05:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:05:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:05:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:05:10 compute-1 ceph-mon[80115]: pgmap v773: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Feb 02 10:05:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:05:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:10.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:05:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:10.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc001ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:11 compute-1 nova_compute[226294]: 2026-02-02 10:05:11.896 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:05:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:12.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:12.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:13 compute-1 ceph-mon[80115]: pgmap v774: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Feb 02 10:05:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:13 compute-1 nova_compute[226294]: 2026-02-02 10:05:13.897 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:14 compute-1 ceph-mon[80115]: pgmap v775: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Feb 02 10:05:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:14.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:14 compute-1 nova_compute[226294]: 2026-02-02 10:05:14.946 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:14.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:15 compute-1 sudo[231206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:05:15 compute-1 sudo[231206]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:05:15 compute-1 sudo[231206]: pam_unix(sudo:session): session closed for user root
Feb 02 10:05:15 compute-1 nova_compute[226294]: 2026-02-02 10:05:15.702 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:15 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:05:15.702 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:05:15 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:05:15.705 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:05:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:16 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:05:16 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:05:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:16.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:16.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:17 compute-1 ceph-mon[80115]: pgmap v776: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:05:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:05:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:18.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:18 compute-1 nova_compute[226294]: 2026-02-02 10:05:18.899 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:18 compute-1 sudo[231233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:05:18 compute-1 sudo[231233]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:05:18 compute-1 sudo[231233]: pam_unix(sudo:session): session closed for user root
Feb 02 10:05:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:18.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:19 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:05:19.707 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:05:19 compute-1 ceph-mon[80115]: pgmap v777: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:05:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:19 compute-1 nova_compute[226294]: 2026-02-02 10:05:19.948 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:05:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:20.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:05:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:20.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:21 compute-1 ceph-mon[80115]: pgmap v778: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:05:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:22.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:22.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:23 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 02 10:05:23 compute-1 ceph-mon[80115]: pgmap v779: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:05:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:23 compute-1 nova_compute[226294]: 2026-02-02 10:05:23.899 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:24.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:24 compute-1 nova_compute[226294]: 2026-02-02 10:05:24.986 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:05:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:24.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:05:25 compute-1 ceph-mon[80115]: pgmap v780: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:05:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:26 compute-1 podman[231262]: 2026-02-02 10:05:26.480579042 +0000 UTC m=+0.150230242 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Feb 02 10:05:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:26.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:26.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:27 compute-1 ceph-mon[80115]: pgmap v781: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:05:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/297270713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:05:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc002ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:28.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:28 compute-1 nova_compute[226294]: 2026-02-02 10:05:28.931 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:05:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:28.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:05:29 compute-1 ceph-mon[80115]: pgmap v782: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:05:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:30 compute-1 nova_compute[226294]: 2026-02-02 10:05:30.027 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:30.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:31 compute-1 ceph-mon[80115]: pgmap v783: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:05:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:05:32 compute-1 ceph-mon[80115]: pgmap v784: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:05:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:32.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:32.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:33 compute-1 podman[231290]: 2026-02-02 10:05:33.40307638 +0000 UTC m=+0.079217256 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:05:33 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/789059564' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:05:33 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3962326735' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:05:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:33 compute-1 nova_compute[226294]: 2026-02-02 10:05:33.977 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:34 compute-1 ceph-mon[80115]: pgmap v785: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Feb 02 10:05:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:34.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:35.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:35 compute-1 nova_compute[226294]: 2026-02-02 10:05:35.060 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:05:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:36.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:05:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:37.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:37 compute-1 ceph-mon[80115]: pgmap v786: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Feb 02 10:05:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:38.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100539 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:05:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:39 compute-1 nova_compute[226294]: 2026-02-02 10:05:39.030 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:39.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:39 compute-1 sudo[231312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:05:39 compute-1 sudo[231312]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:05:39 compute-1 sudo[231312]: pam_unix(sudo:session): session closed for user root
Feb 02 10:05:39 compute-1 ceph-mon[80115]: pgmap v787: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 1.8 MiB/s wr, 74 op/s
Feb 02 10:05:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498004140 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:40 compute-1 nova_compute[226294]: 2026-02-02 10:05:40.063 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:40.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:41.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:41 compute-1 ceph-mon[80115]: pgmap v788: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 1.8 MiB/s wr, 74 op/s
Feb 02 10:05:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:42 compute-1 ceph-mon[80115]: pgmap v789: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 1.8 MiB/s wr, 74 op/s
Feb 02 10:05:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:42.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:43.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:44 compute-1 nova_compute[226294]: 2026-02-02 10:05:44.063 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:05:44.905 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:05:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:05:44.906 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:05:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:05:44.906 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:05:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:44.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:45.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:45 compute-1 nova_compute[226294]: 2026-02-02 10:05:45.065 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:45 compute-1 ceph-mon[80115]: pgmap v790: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 108 op/s
Feb 02 10:05:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:46.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:05:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:47.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:05:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:47 compute-1 ceph-mon[80115]: pgmap v791: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:05:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:05:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 10:05:48 compute-1 ceph-mon[80115]: pgmap v792: 353 pgs: 353 active+clean; 109 MiB data, 283 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 2.0 MiB/s wr, 101 op/s
Feb 02 10:05:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:48.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:49.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:49 compute-1 nova_compute[226294]: 2026-02-02 10:05:49.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:50 compute-1 nova_compute[226294]: 2026-02-02 10:05:50.067 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:50.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:51.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:51 compute-1 ceph-mon[80115]: pgmap v793: 353 pgs: 353 active+clean; 109 MiB data, 283 MiB used, 60 GiB / 60 GiB avail; 1.1 MiB/s rd, 2.0 MiB/s wr, 62 op/s
Feb 02 10:05:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 10:05:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:05:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:52.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:05:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:53.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:05:53 compute-1 ceph-mon[80115]: pgmap v794: 353 pgs: 353 active+clean; 109 MiB data, 283 MiB used, 60 GiB / 60 GiB avail; 1.1 MiB/s rd, 2.0 MiB/s wr, 62 op/s
Feb 02 10:05:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:54 compute-1 nova_compute[226294]: 2026-02-02 10:05:54.079 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 10:05:54 compute-1 ceph-mon[80115]: pgmap v795: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.4 MiB/s rd, 2.1 MiB/s wr, 101 op/s
Feb 02 10:05:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:54.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:55.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:55 compute-1 nova_compute[226294]: 2026-02-02 10:05:55.070 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/745370475' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:05:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/745370475' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:05:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:56 compute-1 ceph-mon[80115]: pgmap v796: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Feb 02 10:05:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:56.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:57.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:05:57 compute-1 podman[231349]: 2026-02-02 10:05:57.457443534 +0000 UTC m=+0.125536246 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 02 10:05:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:05:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:05:58.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:05:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:05:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:05:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:05:59.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:05:59 compute-1 nova_compute[226294]: 2026-02-02 10:05:59.112 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:05:59 compute-1 sudo[231376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:05:59 compute-1 sudo[231376]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:05:59 compute-1 sudo[231376]: pam_unix(sudo:session): session closed for user root
Feb 02 10:05:59 compute-1 ceph-mon[80115]: pgmap v797: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 335 KiB/s rd, 2.1 MiB/s wr, 69 op/s
Feb 02 10:05:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:05:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:05:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:00 compute-1 nova_compute[226294]: 2026-02-02 10:06:00.071 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:00.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100601 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:06:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:01.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:01 compute-1 ceph-mon[80115]: pgmap v798: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 231 KiB/s rd, 109 KiB/s wr, 41 op/s
Feb 02 10:06:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.716554) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762716595, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 984, "num_deletes": 251, "total_data_size": 2136220, "memory_usage": 2174248, "flush_reason": "Manual Compaction"}
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762734395, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1411299, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25088, "largest_seqno": 26067, "table_properties": {"data_size": 1406868, "index_size": 2083, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10028, "raw_average_key_size": 19, "raw_value_size": 1397892, "raw_average_value_size": 2751, "num_data_blocks": 93, "num_entries": 508, "num_filter_entries": 508, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026687, "oldest_key_time": 1770026687, "file_creation_time": 1770026762, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 17929 microseconds, and 4816 cpu microseconds.
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.734478) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1411299 bytes OK
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.734502) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.736925) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.736952) EVENT_LOG_v1 {"time_micros": 1770026762736944, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.736974) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2131277, prev total WAL file size 2131277, number of live WAL files 2.
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.737599) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1378KB)], [48(12MB)]
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762737640, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 14532932, "oldest_snapshot_seqno": -1}
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5407 keys, 12364573 bytes, temperature: kUnknown
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762827031, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 12364573, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12328990, "index_size": 20945, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 138963, "raw_average_key_size": 25, "raw_value_size": 12231647, "raw_average_value_size": 2262, "num_data_blocks": 851, "num_entries": 5407, "num_filter_entries": 5407, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026762, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.827361) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 12364573 bytes
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.831443) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.4 rd, 138.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 12.5 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(19.1) write-amplify(8.8) OK, records in: 5923, records dropped: 516 output_compression: NoCompression
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.831471) EVENT_LOG_v1 {"time_micros": 1770026762831458, "job": 28, "event": "compaction_finished", "compaction_time_micros": 89471, "compaction_time_cpu_micros": 32110, "output_level": 6, "num_output_files": 1, "total_output_size": 12364573, "num_input_records": 5923, "num_output_records": 5407, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762831844, "job": 28, "event": "table_file_deletion", "file_number": 50}
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026762833744, "job": 28, "event": "table_file_deletion", "file_number": 48}
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.737534) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833864) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833872) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833875) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833878) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:06:02 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:06:02.833881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:06:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:02.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:03.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:03 compute-1 ceph-mon[80115]: pgmap v799: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 231 KiB/s rd, 109 KiB/s wr, 41 op/s
Feb 02 10:06:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:04 compute-1 nova_compute[226294]: 2026-02-02 10:06:04.116 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:04 compute-1 podman[231403]: 2026-02-02 10:06:04.399404109 +0000 UTC m=+0.067840044 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:06:04 compute-1 ceph-mon[80115]: pgmap v800: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 231 KiB/s rd, 112 KiB/s wr, 41 op/s
Feb 02 10:06:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:06:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:04.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:06:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:05.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:05 compute-1 nova_compute[226294]: 2026-02-02 10:06:05.074 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:06.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:07.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:07 compute-1 nova_compute[226294]: 2026-02-02 10:06:07.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:07 compute-1 ceph-mon[80115]: pgmap v801: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 6.2 KiB/s rd, 16 KiB/s wr, 1 op/s
Feb 02 10:06:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/515216054' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3091021254' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:08 compute-1 nova_compute[226294]: 2026-02-02 10:06:08.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:08 compute-1 nova_compute[226294]: 2026-02-02 10:06:08.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:08 compute-1 nova_compute[226294]: 2026-02-02 10:06:08.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:06:08 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1238960955' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:08.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:09.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:09 compute-1 nova_compute[226294]: 2026-02-02 10:06:09.122 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:09 compute-1 nova_compute[226294]: 2026-02-02 10:06:09.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:09 compute-1 nova_compute[226294]: 2026-02-02 10:06:09.665 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:09 compute-1 nova_compute[226294]: 2026-02-02 10:06:09.666 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:09 compute-1 ceph-mon[80115]: pgmap v802: 353 pgs: 353 active+clean; 167 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Feb 02 10:06:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84bc004df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.083 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.721 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.722 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.783 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.784 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.784 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.785 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:06:10 compute-1 nova_compute[226294]: 2026-02-02 10:06:10.785 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:06:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2415136399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:10 compute-1 ceph-mon[80115]: pgmap v803: 353 pgs: 353 active+clean; 167 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:06:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2147545547' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:06:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:10.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:06:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:11.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:06:11 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/771773473' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:11 compute-1 nova_compute[226294]: 2026-02-02 10:06:11.302 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:06:11 compute-1 nova_compute[226294]: 2026-02-02 10:06:11.484 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:06:11 compute-1 nova_compute[226294]: 2026-02-02 10:06:11.486 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4972MB free_disk=59.92196273803711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:06:11 compute-1 nova_compute[226294]: 2026-02-02 10:06:11.487 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:06:11 compute-1 nova_compute[226294]: 2026-02-02 10:06:11.487 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:06:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4008810149' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:06:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/771773473' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3764222666' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1097599874' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:06:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:12 compute-1 nova_compute[226294]: 2026-02-02 10:06:12.113 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:06:12 compute-1 nova_compute[226294]: 2026-02-02 10:06:12.114 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:06:12 compute-1 nova_compute[226294]: 2026-02-02 10:06:12.134 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:06:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:06:12 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3165935581' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:12 compute-1 nova_compute[226294]: 2026-02-02 10:06:12.573 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:06:12 compute-1 nova_compute[226294]: 2026-02-02 10:06:12.578 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:06:12 compute-1 nova_compute[226294]: 2026-02-02 10:06:12.612 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:06:12 compute-1 nova_compute[226294]: 2026-02-02 10:06:12.613 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:06:12 compute-1 nova_compute[226294]: 2026-02-02 10:06:12.613 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:06:12 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3165935581' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:12 compute-1 ceph-mon[80115]: pgmap v804: 353 pgs: 353 active+clean; 167 MiB data, 318 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:06:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:06:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:12.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:06:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:13.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:13 compute-1 nova_compute[226294]: 2026-02-02 10:06:13.541 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:06:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:14 compute-1 nova_compute[226294]: 2026-02-02 10:06:14.124 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:06:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:14.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:06:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:06:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:15.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:06:15 compute-1 nova_compute[226294]: 2026-02-02 10:06:15.085 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:15 compute-1 sudo[231474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:06:15 compute-1 sudo[231474]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:06:15 compute-1 sudo[231474]: pam_unix(sudo:session): session closed for user root
Feb 02 10:06:15 compute-1 ceph-mon[80115]: pgmap v805: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:06:15 compute-1 sudo[231499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:06:15 compute-1 sudo[231499]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:06:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:16 compute-1 sudo[231499]: pam_unix(sudo:session): session closed for user root
Feb 02 10:06:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:16.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:17.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:17 compute-1 ceph-mon[80115]: pgmap v806: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:06:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:06:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800a910 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:18 compute-1 ceph-mon[80115]: pgmap v807: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Feb 02 10:06:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:18.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:19.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:19 compute-1 nova_compute[226294]: 2026-02-02 10:06:19.127 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:19 compute-1 sudo[231556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:06:19 compute-1 sudo[231556]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:06:19 compute-1 sudo[231556]: pam_unix(sudo:session): session closed for user root
Feb 02 10:06:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:06:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:06:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:06:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:06:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:06:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:06:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:06:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:06:20 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:06:20 compute-1 nova_compute[226294]: 2026-02-02 10:06:20.086 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:20.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:21 compute-1 ceph-mon[80115]: pgmap v808: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:06:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:21.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:22.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:23.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:23 compute-1 ceph-mon[80115]: pgmap v809: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:06:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:24 compute-1 nova_compute[226294]: 2026-02-02 10:06:24.130 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:24 compute-1 ceph-mon[80115]: pgmap v810: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Feb 02 10:06:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:06:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:24.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:06:25 compute-1 sudo[231586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:06:25 compute-1 sudo[231586]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:06:25 compute-1 sudo[231586]: pam_unix(sudo:session): session closed for user root
Feb 02 10:06:25 compute-1 nova_compute[226294]: 2026-02-02 10:06:25.088 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:25.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:06:25 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:06:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:26 compute-1 ceph-mon[80115]: pgmap v811: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Feb 02 10:06:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:26.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb 02 10:06:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:27.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb 02 10:06:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:28 compute-1 podman[231612]: 2026-02-02 10:06:28.434072 +0000 UTC m=+0.095354294 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 02 10:06:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:28.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:06:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:29.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:06:29 compute-1 nova_compute[226294]: 2026-02-02 10:06:29.132 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:29 compute-1 ceph-mon[80115]: pgmap v812: 353 pgs: 353 active+clean; 200 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 138 op/s
Feb 02 10:06:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:30 compute-1 nova_compute[226294]: 2026-02-02 10:06:30.090 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:30 compute-1 ceph-mon[80115]: pgmap v813: 353 pgs: 353 active+clean; 200 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 377 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Feb 02 10:06:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:30.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:31.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:06:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:06:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:33.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:06:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:33.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:33 compute-1 ceph-mon[80115]: pgmap v814: 353 pgs: 353 active+clean; 200 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 377 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Feb 02 10:06:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:34 compute-1 nova_compute[226294]: 2026-02-02 10:06:34.135 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:35.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:35.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:35 compute-1 nova_compute[226294]: 2026-02-02 10:06:35.119 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:35 compute-1 podman[231642]: 2026-02-02 10:06:35.397088433 +0000 UTC m=+0.076801252 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb 02 10:06:35 compute-1 ceph-mon[80115]: pgmap v815: 353 pgs: 353 active+clean; 121 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 397 KiB/s rd, 2.1 MiB/s wr, 91 op/s
Feb 02 10:06:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:37.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:37.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:37 compute-1 ceph-mon[80115]: pgmap v816: 353 pgs: 353 active+clean; 121 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 397 KiB/s rd, 2.1 MiB/s wr, 91 op/s
Feb 02 10:06:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:38 compute-1 nova_compute[226294]: 2026-02-02 10:06:38.060 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:06:38.060 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:06:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:06:38.062 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:06:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:38 compute-1 ceph-mon[80115]: pgmap v817: 353 pgs: 353 active+clean; 121 MiB data, 304 MiB used, 60 GiB / 60 GiB avail; 398 KiB/s rd, 2.1 MiB/s wr, 92 op/s
Feb 02 10:06:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:39.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:39.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:39 compute-1 nova_compute[226294]: 2026-02-02 10:06:39.137 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:39 compute-1 sudo[231665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:06:39 compute-1 sudo[231665]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:06:39 compute-1 sudo[231665]: pam_unix(sudo:session): session closed for user root
Feb 02 10:06:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/320512063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:40 compute-1 nova_compute[226294]: 2026-02-02 10:06:40.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:40 compute-1 ceph-mon[80115]: pgmap v818: 353 pgs: 353 active+clean; 121 MiB data, 304 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Feb 02 10:06:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:41.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:41.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:42 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:06:42.064 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:06:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:06:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:43.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:06:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:43.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:43 compute-1 ceph-mon[80115]: pgmap v819: 353 pgs: 353 active+clean; 121 MiB data, 304 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Feb 02 10:06:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:44 compute-1 nova_compute[226294]: 2026-02-02 10:06:44.140 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2273464306' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:06:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:06:44.907 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:06:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:06:44.907 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:06:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:06:44.907 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:06:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:45.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:45.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:45 compute-1 nova_compute[226294]: 2026-02-02 10:06:45.173 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:45 compute-1 ceph-mon[80115]: pgmap v820: 353 pgs: 353 active+clean; 41 MiB data, 287 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 2.3 KiB/s wr, 56 op/s
Feb 02 10:06:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:46 compute-1 ceph-mon[80115]: pgmap v821: 353 pgs: 353 active+clean; 41 MiB data, 287 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.3 KiB/s wr, 29 op/s
Feb 02 10:06:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:47.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:47.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:06:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:48 compute-1 ceph-mon[80115]: pgmap v822: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.3 KiB/s wr, 29 op/s
Feb 02 10:06:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:49.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:49.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:49 compute-1 nova_compute[226294]: 2026-02-02 10:06:49.143 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494003690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:50 compute-1 nova_compute[226294]: 2026-02-02 10:06:50.228 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:51.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:51.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:51 compute-1 ceph-mon[80115]: pgmap v823: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Feb 02 10:06:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:52 compute-1 ceph-mon[80115]: pgmap v824: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Feb 02 10:06:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:53.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:53.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:54 compute-1 nova_compute[226294]: 2026-02-02 10:06:54.146 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:54 compute-1 ceph-mon[80115]: pgmap v825: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Feb 02 10:06:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:06:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:55.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:06:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:55.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:55 compute-1 nova_compute[226294]: 2026-02-02 10:06:55.254 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2065013609' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:06:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2065013609' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:06:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:56 compute-1 ceph-mon[80115]: pgmap v826: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:06:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:57.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:57.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:06:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b80089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:06:59.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:06:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:06:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:06:59.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:06:59 compute-1 nova_compute[226294]: 2026-02-02 10:06:59.149 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:06:59 compute-1 sudo[231709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:06:59 compute-1 sudo[231709]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:06:59 compute-1 sudo[231709]: pam_unix(sudo:session): session closed for user root
Feb 02 10:06:59 compute-1 podman[231703]: 2026-02-02 10:06:59.410669524 +0000 UTC m=+0.075175519 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:06:59 compute-1 ceph-mon[80115]: pgmap v827: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:06:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:06:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:06:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:00 compute-1 nova_compute[226294]: 2026-02-02 10:07:00.283 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:00 compute-1 ceph-mon[80115]: pgmap v828: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:07:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:01.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.102 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.102 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.125 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 02 10:07:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:01.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.190 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.190 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.196 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.196 226298 INFO nova.compute.claims [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Claim successful on node compute-1.ctlplane.example.com
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.289 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:07:01 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:07:01 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/198600574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.760 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.765 226298 DEBUG nova.compute.provider_tree [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:07:01 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/198600574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.787 226298 DEBUG nova.scheduler.client.report [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.811 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.812 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.869 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.870 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 02 10:07:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.890 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 02 10:07:01 compute-1 nova_compute[226294]: 2026-02-02 10:07:01.909 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 02 10:07:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.007 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.008 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.008 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Creating image(s)
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.035 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.067 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.094 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.098 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.156 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.157 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.159 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.159 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.198 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.202 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:07:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.572 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.369s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.663 226298 DEBUG nova.policy [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b1695a2a70d4aa0aa350ba17d8f6d5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.674 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] resizing rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Feb 02 10:07:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:07:02 compute-1 ceph-mon[80115]: pgmap v829: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.827 226298 DEBUG nova.objects.instance [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'migration_context' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.843 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.843 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Ensure instance console log exists: /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.844 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.844 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:02 compute-1 nova_compute[226294]: 2026-02-02 10:07:02.844 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:03.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:03.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:04 compute-1 nova_compute[226294]: 2026-02-02 10:07:04.152 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:04 compute-1 nova_compute[226294]: 2026-02-02 10:07:04.672 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Successfully created port: 09a00258-4f60-42dd-a769-b2ea3b870187 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 02 10:07:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:05.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:05.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:05 compute-1 nova_compute[226294]: 2026-02-02 10:07:05.329 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:05 compute-1 nova_compute[226294]: 2026-02-02 10:07:05.422 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Successfully updated port: 09a00258-4f60-42dd-a769-b2ea3b870187 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 02 10:07:05 compute-1 nova_compute[226294]: 2026-02-02 10:07:05.435 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:07:05 compute-1 nova_compute[226294]: 2026-02-02 10:07:05.435 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:07:05 compute-1 nova_compute[226294]: 2026-02-02 10:07:05.435 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 02 10:07:05 compute-1 nova_compute[226294]: 2026-02-02 10:07:05.540 226298 DEBUG nova.compute.manager [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:07:05 compute-1 nova_compute[226294]: 2026-02-02 10:07:05.541 226298 DEBUG nova.compute.manager [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:07:05 compute-1 nova_compute[226294]: 2026-02-02 10:07:05.541 226298 DEBUG oslo_concurrency.lockutils [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:07:05 compute-1 nova_compute[226294]: 2026-02-02 10:07:05.590 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 02 10:07:05 compute-1 ceph-mon[80115]: pgmap v830: 353 pgs: 353 active+clean; 88 MiB data, 275 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:07:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:06 compute-1 podman[231946]: 2026-02-02 10:07:06.367725129 +0000 UTC m=+0.048363576 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:07:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.676 226298 DEBUG nova.network.neutron [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.709 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.709 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance network_info: |[{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
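
The network_info blob logged between the `|...|` markers above is ordinary JSON. As a minimal standalone sketch (not nova code), this is how the MAC and fixed IPv4 addresses could be pulled out of such a structure; the sample dict below only mirrors the fields visible in the log entry:

    def fixed_ips(network_info):
        """Yield (mac, ip) pairs from a nova network_info list like the one logged above."""
        for vif in network_info:
            mac = vif["address"]
            for subnet in vif["network"]["subnets"]:
                for ip in subnet["ips"]:
                    if ip.get("type") == "fixed":
                        yield mac, ip["address"]

    # Minimal stand-in mirroring the structure in the log entry above.
    sample = [{"address": "fa:16:3e:85:9a:96",
               "network": {"subnets": [{"ips": [{"address": "10.100.0.10",
                                                 "type": "fixed"}]}]}}]
    print(list(fixed_ips(sample)))   # [('fa:16:3e:85:9a:96', '10.100.0.10')]
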
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.710 226298 DEBUG oslo_concurrency.lockutils [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.710 226298 DEBUG nova.network.neutron [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.713 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start _get_guest_xml network_info=[{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'guest_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': 'd5e062d7-95ef-409c-9ad0-60f7cf6f44ce'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.718 226298 WARNING nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
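
The WARNING above concerns the `socket` PCI NUMA affinity policy: on a host where a NUMA node spans several sockets, nova cannot honour that policy. It is normally requested per flavor; a purely illustrative snippet (the extra-spec key is the standard upstream nova one, it does not appear anywhere in this log):

    # Flavor extra specs requesting socket-level PCI NUMA affinity; on this host
    # the warning above means such a request could not be satisfied.
    extra_specs = {"hw:pci_numa_affinity_policy": "socket"}  # other values: required, preferred, legacy
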
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.722 226298 DEBUG nova.virt.libvirt.host [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.723 226298 DEBUG nova.virt.libvirt.host [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.726 226298 DEBUG nova.virt.libvirt.host [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.726 226298 DEBUG nova.virt.libvirt.host [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.727 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.727 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-02T10:01:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1194feb9-e285-414e-825a-1e77171d092f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.728 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.728 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.728 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.728 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.729 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.729 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.729 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.729 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.730 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.730 226298 DEBUG nova.virt.hardware [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
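
The lines above show the guest CPU topology selection for this 1-vCPU m1.nano flavor with no explicit limits, ending in a single 1:1:1 candidate. A rough sketch of the idea (not nova's actual implementation): enumerate every sockets/cores/threads split whose product is the vCPU count, subject to the logged maxima:

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        """Enumerate (sockets, cores, threads) triples that multiply out to exactly
        `vcpus`, mirroring the 'Build topologies ... Got N possible topologies' lines."""
        topos = []
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        topos.append((s, c, t))
        return topos

    print(possible_topologies(1))   # [(1, 1, 1)] -- matches the single topology logged above
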
Feb 02 10:07:06 compute-1 nova_compute[226294]: 2026-02-02 10:07:06.733 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:07:06 compute-1 ceph-mon[80115]: pgmap v831: 353 pgs: 353 active+clean; 88 MiB data, 275 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:07:06 compute-1 sshd-session[231965]: Invalid user admin from 45.148.10.121 port 43666
Feb 02 10:07:06 compute-1 sshd-session[231965]: Connection closed by invalid user admin 45.148.10.121 port 43666 [preauth]
Feb 02 10:07:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:07.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 02 10:07:07 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1442920525' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:07:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:07.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.142 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
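
The `ceph mon dump --format=json` call that just returned is how the compute service discovers the monitor endpoints that later appear as `<host .../>` entries in the RBD disk XML. A minimal standalone sketch of the same lookup with subprocess, using the `openstack` client id and ceph.conf path shown in the log; the JSON field names follow the usual Ceph monmap layout, so verify them against your release:

    import json
    import subprocess

    # Same command the compute service logs above; requires the client.openstack keyring.
    out = subprocess.run(
        ["ceph", "mon", "dump", "--format=json",
         "--id", "openstack", "--conf", "/etc/ceph/ceph.conf"],
        check=True, capture_output=True, text=True).stdout

    mon_map = json.loads(out)
    # Print every v1 (port 6789) monitor endpoint from the dumped monmap.
    for mon in mon_map.get("mons", []):
        for addr in mon.get("public_addrs", {}).get("addrvec", []):
            if addr.get("type") == "v1":
                print(mon["name"], addr["addr"])
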
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.167 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.170 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:07:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 02 10:07:07 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/671941752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.576 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.578 226298 DEBUG nova.virt.libvirt.vif [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:07:01Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.578 226298 DEBUG nova.network.os_vif_util [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.579 226298 DEBUG nova.network.os_vif_util [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.580 226298 DEBUG nova.objects.instance [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'pci_devices' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.599 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] End _get_guest_xml xml=<domain type="kvm">
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <name>instance-00000006</name>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <memory>131072</memory>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <vcpu>1</vcpu>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <metadata>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <nova:creationTime>2026-02-02 10:07:06</nova:creationTime>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <nova:flavor name="m1.nano">
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <nova:memory>128</nova:memory>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <nova:disk>1</nova:disk>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <nova:swap>0</nova:swap>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <nova:vcpus>1</nova:vcpus>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       </nova:flavor>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <nova:owner>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       </nova:owner>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <nova:ports>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb 02 10:07:07 compute-1 nova_compute[226294]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         </nova:port>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       </nova:ports>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     </nova:instance>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   </metadata>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <sysinfo type="smbios">
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <system>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <entry name="manufacturer">RDO</entry>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <entry name="product">OpenStack Compute</entry>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <entry name="serial">15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <entry name="uuid">15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <entry name="family">Virtual Machine</entry>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     </system>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   </sysinfo>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <os>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <boot dev="hd"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <smbios mode="sysinfo"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   </os>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <features>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <acpi/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <apic/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <vmcoreinfo/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   </features>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <clock offset="utc">
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <timer name="pit" tickpolicy="delay"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <timer name="hpet" present="no"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   </clock>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <cpu mode="host-model" match="exact">
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <topology sockets="1" cores="1" threads="1"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   </cpu>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   <devices>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <disk type="network" device="disk">
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <driver type="raw" cache="none"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <source protocol="rbd" name="vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk">
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <host name="192.168.122.100" port="6789"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <host name="192.168.122.102" port="6789"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <host name="192.168.122.101" port="6789"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       </source>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <auth username="openstack">
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <target dev="vda" bus="virtio"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <disk type="network" device="cdrom">
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <driver type="raw" cache="none"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <source protocol="rbd" name="vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config">
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <host name="192.168.122.100" port="6789"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <host name="192.168.122.102" port="6789"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <host name="192.168.122.101" port="6789"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       </source>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <auth username="openstack">
Feb 02 10:07:07 compute-1 nova_compute[226294]:         <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <target dev="sda" bus="sata"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <interface type="ethernet">
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <mac address="fa:16:3e:85:9a:96"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <model type="virtio"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <driver name="vhost" rx_queue_size="512"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <mtu size="1442"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <target dev="tap09a00258-4f"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     </interface>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <serial type="pty">
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <log file="/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log" append="off"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     </serial>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <video>
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <model type="virtio"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     </video>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <input type="tablet" bus="usb"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <rng model="virtio">
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <backend model="random">/dev/urandom</backend>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     </rng>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <controller type="usb" index="0"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     <memballoon model="virtio">
Feb 02 10:07:07 compute-1 nova_compute[226294]:       <stats period="10"/>
Feb 02 10:07:07 compute-1 nova_compute[226294]:     </memballoon>
Feb 02 10:07:07 compute-1 nova_compute[226294]:   </devices>
Feb 02 10:07:07 compute-1 nova_compute[226294]: </domain>
Feb 02 10:07:07 compute-1 nova_compute[226294]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
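
The `<domain>` document logged above is what nova hands to libvirt for this instance. As an illustrative sketch only (nova drives libvirt through its own driver code, not these exact calls), defining and starting such a domain directly with the libvirt Python bindings would look roughly like this; `domain.xml` is a hypothetical file holding the XML above:

    import libvirt

    with open("domain.xml") as f:          # e.g. the <domain> document logged above
        xml = f.read()

    conn = libvirt.open("qemu:///system")  # local KVM hypervisor URI
    try:
        dom = conn.defineXML(xml)          # persist the definition
        dom.create()                       # boot it (equivalent to 'virsh start')
        print(dom.name(), dom.UUIDString())
    finally:
        conn.close()
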
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.600 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Preparing to wait for external event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.600 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.600 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.600 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
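
The Acquiring/acquired/released messages above (and the earlier Acquired/Releasing pair around the refresh_cache lock) come from oslo.concurrency's lockutils. A minimal sketch of both public forms, reusing the lock name from the log; whether the debug lines appear depends on the service's oslo.log configuration:

    from oslo_concurrency import lockutils

    # Context-manager form: corresponds to the "Acquired lock ... / Releasing lock ..."
    # pairs seen around refresh_cache above.
    with lockutils.lock("15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events"):
        pass  # critical section protected by the named, in-process lock

    # Decorator form, as used by prepare_for_instance_event's inner helper:
    @lockutils.synchronized("15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events")
    def create_or_get_event():
        pass
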
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.601 226298 DEBUG nova.virt.libvirt.vif [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:07:01Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.601 226298 DEBUG nova.network.os_vif_util [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.602 226298 DEBUG nova.network.os_vif_util [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.602 226298 DEBUG os_vif [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.603 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.603 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.604 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.608 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.608 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09a00258-4f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.608 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09a00258-4f, col_values=(('external_ids', {'iface-id': '09a00258-4f60-42dd-a769-b2ea3b870187', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:9a:96', 'vm-uuid': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.610 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:07 compute-1 NetworkManager[49055]: <info>  [1770026827.6118] manager: (tap09a00258-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.612 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.616 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.616 226298 INFO os_vif [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f')
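
The plug above was carried out through ovsdbapp against the local Open vSwitch database: an AddBridgeCommand (no-op, br-int already exists), then AddPortCommand plus a DbSetCommand writing the iface-id/attached-mac external_ids that ovn-controller later matches when claiming the lport. A rough standalone sketch of an equivalent transaction with ovsdbapp; the class and method names (OvsdbIdl.from_server, add_port, db_set) are ovsdbapp's public API and the socket path is the usual local default, so treat this as a sketch rather than the exact os-vif code path:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Connect to the local switch database (default unix socket path assumed).
    idl = connection.OvsdbIdl.from_server("unix:/run/openvswitch/db.sock", "Open_vSwitch")
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        # Mirrors AddPortCommand(bridge=br-int, port=tap09a00258-4f, may_exist=True)
        txn.add(api.add_port("br-int", "tap09a00258-4f", may_exist=True))
        # Mirrors the DbSetCommand on Interface.external_ids logged above
        txn.add(api.db_set(
            "Interface", "tap09a00258-4f",
            ("external_ids", {
                "iface-id": "09a00258-4f60-42dd-a769-b2ea3b870187",
                "iface-status": "active",
                "attached-mac": "fa:16:3e:85:9a:96",
                "vm-uuid": "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220",
            })))
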
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.673 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.673 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.673 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No VIF found with MAC fa:16:3e:85:9a:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.674 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Using config drive
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.700 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:07:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1616455594' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1442920525' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:07:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/671941752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:07:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1821679544' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.804 226298 DEBUG nova.network.neutron [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.805 226298 DEBUG nova.network.neutron [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:07:07 compute-1 nova_compute[226294]: 2026-02-02 10:07:07.820 226298 DEBUG oslo_concurrency.lockutils [req-39baf1d6-6267-4073-a518-beb0581ec750 req-d37672d2-036a-418d-9fc8-b9620ca668c1 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:07:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.011 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Creating config drive at /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.014 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8eow0059 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.130 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8eow0059" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.159 226298 DEBUG nova.storage.rbd_utils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.163 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.338 226298 DEBUG oslo_concurrency.processutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.339 226298 INFO nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Deleting local config drive /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config because it was imported into RBD.
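
The config drive lifecycle above is two shell commands: build an ISO9660 image labelled config-2 with mkisofs from a temporary metadata directory, then push it into the Ceph `vms` pool with `rbd import` and remove the local copy. A minimal subprocess sketch of those two steps, with the paths, pool, and client id taken from the log (the logged mkisofs call also embeds the package version in the publisher string, omitted here for brevity):

    import subprocess

    iso = "/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/disk.config"
    metadata_dir = "/tmp/tmp8eow0059"      # directory holding the config-drive tree

    # Build an ISO9660/Joliet/Rock Ridge image labelled config-2 (as in the log).
    subprocess.run(
        ["/usr/bin/mkisofs", "-o", iso, "-ldots", "-allow-lowercase", "-allow-multidot",
         "-l", "-publisher", "OpenStack Compute", "-quiet", "-J", "-r",
         "-V", "config-2", metadata_dir],
        check=True)

    # Import the image into the Ceph 'vms' pool under the name nova expects.
    subprocess.run(
        ["rbd", "import", "--pool", "vms", iso,
         "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config",
         "--image-format=2", "--id", "openstack", "--conf", "/etc/ceph/ceph.conf"],
        check=True)
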
Feb 02 10:07:08 compute-1 systemd[1]: Starting libvirt secret daemon...
Feb 02 10:07:08 compute-1 systemd[1]: Started libvirt secret daemon.
Feb 02 10:07:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:08 compute-1 kernel: tap09a00258-4f: entered promiscuous mode
Feb 02 10:07:08 compute-1 NetworkManager[49055]: <info>  [1770026828.4072] manager: (tap09a00258-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Feb 02 10:07:08 compute-1 ovn_controller[133666]: 2026-02-02T10:07:08Z|00037|binding|INFO|Claiming lport 09a00258-4f60-42dd-a769-b2ea3b870187 for this chassis.
Feb 02 10:07:08 compute-1 ovn_controller[133666]: 2026-02-02T10:07:08Z|00038|binding|INFO|09a00258-4f60-42dd-a769-b2ea3b870187: Claiming fa:16:3e:85:9a:96 10.100.0.10
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.408 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.413 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.415 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.423 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:9a:96 10.100.0.10'], port_security=['fa:16:3e:85:9a:96 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '09104532-215f-4de3-9920-7fd818e6c676', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=755f8a60-018a-461f-bb4b-b9017895ccf7, chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=09a00258-4f60-42dd-a769-b2ea3b870187) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.425 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 09a00258-4f60-42dd-a769-b2ea3b870187 in datapath ba6c4c87-77a9-4fcc-aa14-a4637c78f692 bound to our chassis
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.426 143542 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ba6c4c87-77a9-4fcc-aa14-a4637c78f692
Feb 02 10:07:08 compute-1 systemd-machined[195072]: New machine qemu-2-instance-00000006.
Feb 02 10:07:08 compute-1 ovn_controller[133666]: 2026-02-02T10:07:08Z|00039|binding|INFO|Setting lport 09a00258-4f60-42dd-a769-b2ea3b870187 ovn-installed in OVS
Feb 02 10:07:08 compute-1 ovn_controller[133666]: 2026-02-02T10:07:08Z|00040|binding|INFO|Setting lport 09a00258-4f60-42dd-a769-b2ea3b870187 up in Southbound
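Annotation: the ovn_controller entries above are the normal claim sequence for the new VIF: the chassis claims the logical port, marks the OVS interface ovn-installed, and sets the port up in the Southbound database. A read-only way to confirm the binding afterwards, sketched with the logical port UUID from the log (assumes ovn-sbctl can reach the local Southbound database):

#!/usr/bin/env python3
# Illustrative only: query the Southbound Port_Binding row for the logical port
# claimed above; the "chassis" column should reference this compute node.
import subprocess

LPORT = "09a00258-4f60-42dd-a769-b2ea3b870187"
out = subprocess.run(
    ["ovn-sbctl", "find", "Port_Binding", f"logical_port={LPORT}"],
    capture_output=True, text=True, check=True,
).stdout
print(out)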
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.434 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.436 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[65e199ed-06de-4d41-a2eb-33014dbc1bb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.436 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapba6c4c87-71 in ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.438 229827 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapba6c4c87-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.438 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[4506bab7-892e-42e2-b5db-173aa418d42d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.438 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[0465fb71-833b-4c77-84fb-afbfdb434584]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000006.
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.448 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[9d47535b-dd51-4276-9cba-183180b95bbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 systemd-udevd[232123]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.455 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[10a1883d-f0e4-413e-a01c-b4cff4d6f13d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 NetworkManager[49055]: <info>  [1770026828.4647] device (tap09a00258-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 02 10:07:08 compute-1 NetworkManager[49055]: <info>  [1770026828.4654] device (tap09a00258-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.473 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[59b31869-9266-4c89-88b6-3c8f50d232a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 NetworkManager[49055]: <info>  [1770026828.4777] manager: (tapba6c4c87-70): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.476 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c1a2c3-9091-491d-ae28-c07a97f6646e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.496 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[0a08a144-e314-48a0-a5cb-54fb507e7430]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.498 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[639aed7a-33ca-4d5a-9a1b-a84df73d8a52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 NetworkManager[49055]: <info>  [1770026828.5112] device (tapba6c4c87-70): carrier: link connected
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.513 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[80f3ffbc-ce86-4b5e-b3e9-9ae90c6f2f49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.525 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[03600d44-1fc6-4f3c-9619-ccd403c15139]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba6c4c87-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:56:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399082, 'reachable_time': 19289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232154, 'error': None, 'target': 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.538 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd17a88-e59a-49b8-bad1-c19ba141fe29]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:567d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399082, 'tstamp': 399082}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232155, 'error': None, 'target': 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.549 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[2e4af456-b786-4f47-aaa1-4405db578468]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba6c4c87-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:56:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399082, 'reachable_time': 19289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232156, 'error': None, 'target': 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.571 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[6236fccb-2caa-41a0-8f1d-f1603319bc5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.609 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[61c0c742-7b6f-42a7-a608-138bdfb509f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.610 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba6c4c87-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.610 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.611 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapba6c4c87-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.639 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:08 compute-1 NetworkManager[49055]: <info>  [1770026828.6397] manager: (tapba6c4c87-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Feb 02 10:07:08 compute-1 kernel: tapba6c4c87-70: entered promiscuous mode
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.642 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.643 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapba6c4c87-70, col_values=(('external_ids', {'iface-id': 'f5df8d3e-4c61-4492-9e28-98679c02afcc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
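Annotation: to reach instances on this network the metadata agent built a veth pair, kept tapba6c4c87-71 inside the ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 namespace, plugged tapba6c4c87-70 into br-int, and set external_ids:iface-id to f5df8d3e-4c61-4492-9e28-98679c02afcc so OVN can bind the interface to the network's metadata port; the transactions above perform exactly those steps. A read-only check of the result might look like this (assumes local ovs-vsctl access):

#!/usr/bin/env python3
# Illustrative only: confirm how the host side of the metadata veth was plugged.
import subprocess

def ovs(*args):
    return subprocess.run(["ovs-vsctl", *args],
                          capture_output=True, text=True, check=True).stdout.strip()

print(ovs("port-to-br", "tapba6c4c87-70"))                        # expected: br-int
print(ovs("get", "Interface", "tapba6c4c87-70", "external_ids"))  # expected: iface-id=f5df8d3e-...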
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.644 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:08 compute-1 ovn_controller[133666]: 2026-02-02T10:07:08Z|00041|binding|INFO|Releasing lport f5df8d3e-4c61-4492-9e28-98679c02afcc from this chassis (sb_readonly=0)
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.647 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.650 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.651 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.651 143542 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ba6c4c87-77a9-4fcc-aa14-a4637c78f692.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ba6c4c87-77a9-4fcc-aa14-a4637c78f692.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.652 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[f1810634-da25-43aa-9de4-2ce0f394addd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.653 143542 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: global
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     log         /dev/log local0 debug
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     log-tag     haproxy-metadata-proxy-ba6c4c87-77a9-4fcc-aa14-a4637c78f692
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     user        root
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     group       root
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     maxconn     1024
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     pidfile     /var/lib/neutron/external/pids/ba6c4c87-77a9-4fcc-aa14-a4637c78f692.pid.haproxy
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     daemon
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: defaults
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     log global
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     mode http
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     option httplog
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     option dontlognull
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     option http-server-close
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     option forwardfor
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     retries                 3
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     timeout http-request    30s
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     timeout connect         30s
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     timeout client          32s
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     timeout server          32s
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     timeout http-keep-alive 30s
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: listen listener
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     bind 169.254.169.254:80
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     server metadata /var/lib/neutron/metadata_proxy
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:     http-request add-header X-OVN-Network-ID ba6c4c87-77a9-4fcc-aa14-a4637c78f692
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 02 10:07:08 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:08.653 143542 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'env', 'PROCESS_TAG=haproxy-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ba6c4c87-77a9-4fcc-aa14-a4637c78f692.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
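Annotation: the generated configuration dumped above makes a per-network haproxy listen on 169.254.169.254:80 inside the ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 namespace, forward every request to the metadata agent's socket at /var/lib/neutron/metadata_proxy, and tag it with an X-OVN-Network-ID header; the earlier ENOENT on the .pid.haproxy file is just the agent confirming no proxy is already running before launching one. A rough liveness check from the hypervisor, assuming root access and that curl is installed (a 404 is normal when the request does not come from an instance address on that network, but any HTTP reply shows the proxy and socket are wired up):

#!/usr/bin/env python3
# Illustrative only: poke the per-network metadata proxy from inside its namespace.
import subprocess

NS = "ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692"
subprocess.run(
    ["ip", "netns", "exec", NS,
     "curl", "-sS", "-o", "/dev/null", "-w", "%{http_code}\n",
     "http://169.254.169.254/openstack/latest/meta_data.json"],
    check=True,
)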
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.764 226298 DEBUG nova.compute.manager [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.765 226298 DEBUG oslo_concurrency.lockutils [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.765 226298 DEBUG oslo_concurrency.lockutils [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.765 226298 DEBUG oslo_concurrency.lockutils [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:08 compute-1 nova_compute[226294]: 2026-02-02 10:07:08.766 226298 DEBUG nova.compute.manager [req-61f82e88-35cc-4a48-940c-0e5050a59af8 req-c64ba354-c440-46ca-8a20-f96eee8d7cd0 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Processing event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 02 10:07:08 compute-1 ceph-mon[80115]: pgmap v832: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Feb 02 10:07:08 compute-1 podman[232188]: 2026-02-02 10:07:08.95946637 +0000 UTC m=+0.053961975 container create 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:07:08 compute-1 systemd[1]: Started libpod-conmon-2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81.scope.
Feb 02 10:07:09 compute-1 systemd[1]: Started libcrun container.
Feb 02 10:07:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7e71b523a6cf72f6079510db5422c0e2666a6b8442a4c07506d8ee1c5789881/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 02 10:07:09 compute-1 podman[232188]: 2026-02-02 10:07:08.93687982 +0000 UTC m=+0.031375445 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb 02 10:07:09 compute-1 podman[232188]: 2026-02-02 10:07:09.034332129 +0000 UTC m=+0.128827784 container init 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 02 10:07:09 compute-1 podman[232188]: 2026-02-02 10:07:09.03887185 +0000 UTC m=+0.133367445 container start 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:07:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:07:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:09.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:07:09 compute-1 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [NOTICE]   (232207) : New worker (232217) forked
Feb 02 10:07:09 compute-1 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [NOTICE]   (232207) : Loading success.
Feb 02 10:07:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:09.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.197 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770026829.197551, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.198 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] VM Started (Lifecycle Event)
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.199 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.202 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.204 226298 INFO nova.virt.libvirt.driver [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance spawned successfully.
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.205 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.226 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.230 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.231 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.231 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.232 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.232 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.233 226298 DEBUG nova.virt.libvirt.driver [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.237 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.265 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.266 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770026829.1976917, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.266 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] VM Paused (Lifecycle Event)
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.289 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.292 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770026829.2014902, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.292 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] VM Resumed (Lifecycle Event)
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.300 226298 INFO nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Took 7.29 seconds to spawn the instance on the hypervisor.
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.300 226298 DEBUG nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.312 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.315 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.341 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.357 226298 INFO nova.compute.manager [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Took 8.19 seconds to build instance.
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.379 226298 DEBUG oslo_concurrency.lockutils [None req-0873adaa-b51f-4db6-ae30-5e6fd5483f88 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
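Annotation: the build for 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 completes here: 7.29 s to spawn on the hypervisor, 8.19 s end-to-end, with the per-instance build lock held for 8.276 s. When comparing boot times across instances, the two "Took ... seconds" lines are easy to harvest from a journal export; a small sketch, assuming the journal has been saved to a plain-text file such as compute-1.log:

#!/usr/bin/env python3
# Illustrative only: collect per-instance spawn/build durations from a journal dump.
import re
import sys

PATTERN = re.compile(
    r"\[instance: (?P<uuid>[0-9a-f-]{36})\] Took (?P<secs>[\d.]+) seconds to "
    r"(?P<what>spawn the instance on the hypervisor|build instance)"
)

for line in open(sys.argv[1] if len(sys.argv) > 1 else "compute-1.log"):
    m = PATTERN.search(line)
    if m:
        print(f"{m['uuid']}  {m['what']:<40} {m['secs']}s")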
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:07:09 compute-1 nova_compute[226294]: 2026-02-02 10:07:09.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:07:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/383745568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.331 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.792 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.793 226298 DEBUG nova.objects.instance [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.884 226298 DEBUG nova.compute.manager [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.884 226298 DEBUG oslo_concurrency.lockutils [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.885 226298 DEBUG oslo_concurrency.lockutils [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.885 226298 DEBUG oslo_concurrency.lockutils [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.885 226298 DEBUG nova.compute.manager [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:07:10 compute-1 nova_compute[226294]: 2026-02-02 10:07:10.886 226298 WARNING nova.compute.manager [req-b29bdf70-3e0d-4917-b23b-d41f3bd0437f req-c80e9dff-8d8b-4a24-883b-8cd538f6faa7 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 for instance with vm_state active and task_state None.
Feb 02 10:07:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:07:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:11.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:07:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:11.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:11 compute-1 ceph-mon[80115]: pgmap v833: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Feb 02 10:07:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/615888526' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490004140 fd 49 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004550 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:12 compute-1 nova_compute[226294]: 2026-02-02 10:07:12.610 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:13.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:13.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:13 compute-1 ceph-mon[80115]: pgmap v834: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Feb 02 10:07:13 compute-1 nova_compute[226294]: 2026-02-02 10:07:13.775 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:13 compute-1 NetworkManager[49055]: <info>  [1770026833.7765] manager: (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Feb 02 10:07:13 compute-1 NetworkManager[49055]: <info>  [1770026833.7776] manager: (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Feb 02 10:07:13 compute-1 ovn_controller[133666]: 2026-02-02T10:07:13Z|00042|binding|INFO|Releasing lport f5df8d3e-4c61-4492-9e28-98679c02afcc from this chassis (sb_readonly=0)
Feb 02 10:07:13 compute-1 nova_compute[226294]: 2026-02-02 10:07:13.788 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:13 compute-1 ovn_controller[133666]: 2026-02-02T10:07:13Z|00043|binding|INFO|Releasing lport f5df8d3e-4c61-4492-9e28-98679c02afcc from this chassis (sb_readonly=0)
Feb 02 10:07:13 compute-1 nova_compute[226294]: 2026-02-02 10:07:13.792 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:13 compute-1 nova_compute[226294]: 2026-02-02 10:07:13.873 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
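Annotation: the cache-heal task above logs the port's full network_info (bridge br-int, MAC fa:16:3e:85:9a:96, fixed IP 10.100.0.10/28, MTU 1442, bound by the ovn driver) as JSON embedded in the line between "network_info: " and the trailing function name, so it can be pulled out directly. A sketch against the same hypothetical compute-1.log export as above:

#!/usr/bin/env python3
# Illustrative only: extract the VIF details Nova logs when refreshing the info cache.
import json
import re

LINE_RE = re.compile(r"network_info: (\[.*\]) update_instance_cache_with_nw_info")

with open("compute-1.log") as fh:  # hypothetical journal export
    for line in fh:
        m = LINE_RE.search(line)
        if not m:
            continue
        for vif in json.loads(m.group(1)):
            ips = [ip["address"]
                   for subnet in vif["network"]["subnets"]
                   for ip in subnet["ips"]]
            print(vif["id"], vif["address"], ips)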
Feb 02 10:07:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:13 compute-1 nova_compute[226294]: 2026-02-02 10:07:13.896 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:07:13 compute-1 nova_compute[226294]: 2026-02-02 10:07:13.897 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 02 10:07:13 compute-1 nova_compute[226294]: 2026-02-02 10:07:13.897 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:07:13 compute-1 nova_compute[226294]: 2026-02-02 10:07:13.898 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:07:13 compute-1 nova_compute[226294]: 2026-02-02 10:07:13.898 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:07:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:14 compute-1 nova_compute[226294]: 2026-02-02 10:07:14.038 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:14 compute-1 nova_compute[226294]: 2026-02-02 10:07:14.038 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:14 compute-1 nova_compute[226294]: 2026-02-02 10:07:14.038 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:14 compute-1 nova_compute[226294]: 2026-02-02 10:07:14.039 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:07:14 compute-1 nova_compute[226294]: 2026-02-02 10:07:14.039 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:07:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c002670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:14 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:07:14 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/386387081' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:14 compute-1 nova_compute[226294]: 2026-02-02 10:07:14.569 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:07:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/386387081' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:07:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:15.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:07:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:07:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:15.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.149 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.150 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.286 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.288 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4787MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.288 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.288 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.329 226298 DEBUG nova.compute.manager [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.330 226298 DEBUG nova.compute.manager [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.330 226298 DEBUG oslo_concurrency.lockutils [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.330 226298 DEBUG oslo_concurrency.lockutils [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.331 226298 DEBUG nova.network.neutron [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.369 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.410 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.411 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.411 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.455 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:07:15 compute-1 ceph-mon[80115]: pgmap v835: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Feb 02 10:07:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:07:15 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1423075345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.909 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.918 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.941 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.973 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:07:15 compute-1 nova_compute[226294]: 2026-02-02 10:07:15.974 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1423075345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:16 compute-1 ceph-mon[80115]: pgmap v836: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:07:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:17.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:17.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:17 compute-1 nova_compute[226294]: 2026-02-02 10:07:17.612 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:07:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004590 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:18 compute-1 nova_compute[226294]: 2026-02-02 10:07:18.177 226298 DEBUG nova.network.neutron [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:07:18 compute-1 nova_compute[226294]: 2026-02-02 10:07:18.178 226298 DEBUG nova.network.neutron [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:07:18 compute-1 nova_compute[226294]: 2026-02-02 10:07:18.213 226298 DEBUG oslo_concurrency.lockutils [req-e6282863-c821-4054-b5f6-8077c650e3b1 req-057ac70b-d0c3-4538-bced-3789ce939154 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:07:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:18 compute-1 ceph-mon[80115]: pgmap v837: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Feb 02 10:07:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:19.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:07:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:19.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:07:19 compute-1 sudo[232313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:07:19 compute-1 sudo[232313]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:07:19 compute-1 sudo[232313]: pam_unix(sudo:session): session closed for user root
Feb 02 10:07:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:20 compute-1 nova_compute[226294]: 2026-02-02 10:07:20.371 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40045b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:21.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:21.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:21 compute-1 ceph-mon[80115]: pgmap v838: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 68 op/s
Feb 02 10:07:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:22 compute-1 ovn_controller[133666]: 2026-02-02T10:07:22Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:85:9a:96 10.100.0.10
Feb 02 10:07:22 compute-1 ovn_controller[133666]: 2026-02-02T10:07:22Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:85:9a:96 10.100.0.10
Feb 02 10:07:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:22 compute-1 nova_compute[226294]: 2026-02-02 10:07:22.650 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:22 compute-1 ceph-mon[80115]: pgmap v839: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 68 op/s
Feb 02 10:07:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:23.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:23.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40045b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:25.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:07:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:25.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:07:25 compute-1 sudo[232341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:07:25 compute-1 sudo[232341]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:07:25 compute-1 sudo[232341]: pam_unix(sudo:session): session closed for user root
Feb 02 10:07:25 compute-1 sudo[232366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:07:25 compute-1 sudo[232366]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:07:25 compute-1 nova_compute[226294]: 2026-02-02 10:07:25.372 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:25 compute-1 ceph-mon[80115]: pgmap v840: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 132 op/s
Feb 02 10:07:25 compute-1 sudo[232366]: pam_unix(sudo:session): session closed for user root
Feb 02 10:07:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003960 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b40045d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:07:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:07:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:07:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:07:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:07:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:07:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:07:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:27.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:07:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:27.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:07:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:27 compute-1 nova_compute[226294]: 2026-02-02 10:07:27.653 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:27 compute-1 ceph-mon[80115]: pgmap v841: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Feb 02 10:07:27 compute-1 nova_compute[226294]: 2026-02-02 10:07:27.753 226298 INFO nova.compute.manager [None req-e8a9e4c2-dcd9-4a84-8af8-41e4bd0aa7ec 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Get console output
Feb 02 10:07:27 compute-1 nova_compute[226294]: 2026-02-02 10:07:27.759 226298 INFO oslo.privsep.daemon [None req-e8a9e4c2-dcd9-4a84-8af8-41e4bd0aa7ec 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp6yb7dc4f/privsep.sock']
Feb 02 10:07:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:28 compute-1 nova_compute[226294]: 2026-02-02 10:07:28.429 226298 INFO oslo.privsep.daemon [None req-e8a9e4c2-dcd9-4a84-8af8-41e4bd0aa7ec 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Spawned new privsep daemon via rootwrap
Feb 02 10:07:28 compute-1 nova_compute[226294]: 2026-02-02 10:07:28.299 232427 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 02 10:07:28 compute-1 nova_compute[226294]: 2026-02-02 10:07:28.302 232427 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 02 10:07:28 compute-1 nova_compute[226294]: 2026-02-02 10:07:28.304 232427 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 02 10:07:28 compute-1 nova_compute[226294]: 2026-02-02 10:07:28.304 232427 INFO oslo.privsep.daemon [-] privsep daemon running as pid 232427
Feb 02 10:07:28 compute-1 nova_compute[226294]: 2026-02-02 10:07:28.531 232427 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 02 10:07:28 compute-1 ceph-mon[80115]: pgmap v842: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Feb 02 10:07:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:29.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:29.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:30 compute-1 nova_compute[226294]: 2026-02-02 10:07:30.375 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:30 compute-1 podman[232430]: 2026-02-02 10:07:30.43933429 +0000 UTC m=+0.106256475 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 02 10:07:30 compute-1 ceph-mon[80115]: pgmap v843: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Feb 02 10:07:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:31.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:31.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:31 compute-1 sudo[232458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:07:31 compute-1 sudo[232458]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:07:31 compute-1 sudo[232458]: pam_unix(sudo:session): session closed for user root
Feb 02 10:07:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004630 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:07:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:07:32 compute-1 nova_compute[226294]: 2026-02-02 10:07:32.110 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:32 compute-1 nova_compute[226294]: 2026-02-02 10:07:32.111 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:32 compute-1 nova_compute[226294]: 2026-02-02 10:07:32.112 226298 DEBUG nova.objects.instance [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'flavor' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:07:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:32 compute-1 nova_compute[226294]: 2026-02-02 10:07:32.655 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:33.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:07:33 compute-1 ceph-mon[80115]: pgmap v844: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Feb 02 10:07:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:07:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:33.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:07:33 compute-1 nova_compute[226294]: 2026-02-02 10:07:33.367 226298 DEBUG nova.objects.instance [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'pci_requests' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:07:33 compute-1 nova_compute[226294]: 2026-02-02 10:07:33.385 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 02 10:07:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004650 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:34 compute-1 nova_compute[226294]: 2026-02-02 10:07:34.157 226298 DEBUG nova.policy [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b1695a2a70d4aa0aa350ba17d8f6d5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 02 10:07:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:35.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:35.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:35 compute-1 nova_compute[226294]: 2026-02-02 10:07:35.283 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Successfully created port: c66e0be1-d166-4088-8ad8-baa84f3d032d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 02 10:07:35 compute-1 nova_compute[226294]: 2026-02-02 10:07:35.415 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:35 compute-1 ceph-mon[80115]: pgmap v845: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Feb 02 10:07:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:36 compute-1 nova_compute[226294]: 2026-02-02 10:07:36.516 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Successfully updated port: c66e0be1-d166-4088-8ad8-baa84f3d032d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 02 10:07:36 compute-1 nova_compute[226294]: 2026-02-02 10:07:36.639 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:07:36 compute-1 nova_compute[226294]: 2026-02-02 10:07:36.640 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:07:36 compute-1 nova_compute[226294]: 2026-02-02 10:07:36.640 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 02 10:07:36 compute-1 nova_compute[226294]: 2026-02-02 10:07:36.724 226298 DEBUG nova.compute.manager [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:07:36 compute-1 nova_compute[226294]: 2026-02-02 10:07:36.725 226298 DEBUG nova.compute.manager [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-c66e0be1-d166-4088-8ad8-baa84f3d032d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:07:36 compute-1 nova_compute[226294]: 2026-02-02 10:07:36.725 226298 DEBUG oslo_concurrency.lockutils [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:07:36 compute-1 ceph-mon[80115]: pgmap v846: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 16 KiB/s wr, 1 op/s
Feb 02 10:07:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:07:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:37.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:07:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:37.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:37 compute-1 podman[232486]: 2026-02-02 10:07:37.414124867 +0000 UTC m=+0.090033523 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Feb 02 10:07:37 compute-1 nova_compute[226294]: 2026-02-02 10:07:37.703 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.740 226298 DEBUG nova.network.neutron [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
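The network_info payload in the record above is valid JSON once the surrounding log text is stripped; a minimal sketch for pulling per-port addresses out of a copy of that payload (the file name network_info.json is an assumption for illustration, nova keeps this in the instance_info_cache table, not in a file):

    import json

    # Assumes the bracketed [...] payload from the log line above was
    # saved verbatim to network_info.json.
    with open("network_info.json") as f:
        vifs = json.load(f)

    for vif in vifs:
        fixed = [ip["address"]
                 for subnet in vif["network"]["subnets"]
                 for ip in subnet["ips"]]
        floating = [fip["address"]
                    for subnet in vif["network"]["subnets"]
                    for ip in subnet["ips"]
                    for fip in ip["floating_ips"]]
        print(vif["id"], vif["address"], fixed, floating, "active:", vif["active"])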
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.786 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.787 226298 DEBUG oslo_concurrency.lockutils [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.787 226298 DEBUG nova.network.neutron [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port c66e0be1-d166-4088-8ad8-baa84f3d032d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.789 226298 DEBUG nova.virt.libvirt.vif [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.790 226298 DEBUG nova.network.os_vif_util [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.790 226298 DEBUG nova.network.os_vif_util [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.791 226298 DEBUG os_vif [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.791 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.792 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.792 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.795 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.795 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc66e0be1-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.796 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc66e0be1-d1, col_values=(('external_ids', {'iface-id': 'c66e0be1-d166-4088-8ad8-baa84f3d032d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:49:24', 'vm-uuid': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.798 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:38 compute-1 NetworkManager[49055]: <info>  [1770026858.8004] manager: (tapc66e0be1-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.801 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.806 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.806 226298 INFO os_vif [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1')
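The plug logged above is three OVSDB operations (AddBridgeCommand, AddPortCommand, DbSetCommand) issued over the ovsdbapp IDL; the sketch below is a rough hand-run equivalent using the ovs-vsctl CLI, shown only to make the transaction contents concrete (os-vif talks to OVSDB directly and does not shell out like this):

    import subprocess

    # Same effect as the AddPortCommand/DbSetCommand pair logged above,
    # expressed as one idempotent ovs-vsctl transaction.
    subprocess.run([
        "ovs-vsctl",
        "--may-exist", "add-port", "br-int", "tapc66e0be1-d1",
        "--", "set", "Interface", "tapc66e0be1-d1",
        "external_ids:iface-id=c66e0be1-d166-4088-8ad8-baa84f3d032d",
        "external_ids:iface-status=active",
        "external_ids:attached-mac=fa:16:3e:2f:49:24",
        "external_ids:vm-uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220",
    ], check=True)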
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.807 226298 DEBUG nova.virt.libvirt.vif [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.808 226298 DEBUG nova.network.os_vif_util [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.808 226298 DEBUG nova.network.os_vif_util [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.811 226298 DEBUG nova.virt.libvirt.guest [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] attach device xml: <interface type="ethernet">
Feb 02 10:07:38 compute-1 nova_compute[226294]:   <mac address="fa:16:3e:2f:49:24"/>
Feb 02 10:07:38 compute-1 nova_compute[226294]:   <model type="virtio"/>
Feb 02 10:07:38 compute-1 nova_compute[226294]:   <driver name="vhost" rx_queue_size="512"/>
Feb 02 10:07:38 compute-1 nova_compute[226294]:   <mtu size="1442"/>
Feb 02 10:07:38 compute-1 nova_compute[226294]:   <target dev="tapc66e0be1-d1"/>
Feb 02 10:07:38 compute-1 nova_compute[226294]: </interface>
Feb 02 10:07:38 compute-1 nova_compute[226294]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
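Guest.attach_device ultimately passes the interface XML logged above to virDomainAttachDeviceFlags; a minimal sketch of the same hot-plug done directly with the libvirt-python bindings, using the instance UUID from the log (live-only here, whereas nova can also persist the change to the domain config):

    import libvirt

    IFACE_XML = """<interface type="ethernet">
      <mac address="fa:16:3e:2f:49:24"/>
      <model type="virtio"/>
      <driver name="vhost" rx_queue_size="512"/>
      <mtu size="1442"/>
      <target dev="tapc66e0be1-d1"/>
    </interface>"""

    conn = libvirt.open("qemu:///system")
    dom = conn.lookupByUUIDString("15b2e821-5e8b-4d8a-9a48-7c6a30bd3220")
    # Hot-plug into the running domain; add VIR_DOMAIN_AFFECT_CONFIG as well
    # if the interface should survive a domain restart.
    dom.attachDeviceFlags(IFACE_XML, libvirt.VIR_DOMAIN_AFFECT_LIVE)
    conn.close()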
Feb 02 10:07:38 compute-1 kernel: tapc66e0be1-d1: entered promiscuous mode
Feb 02 10:07:38 compute-1 NetworkManager[49055]: <info>  [1770026858.8216] manager: (tapc66e0be1-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Feb 02 10:07:38 compute-1 ovn_controller[133666]: 2026-02-02T10:07:38Z|00044|binding|INFO|Claiming lport c66e0be1-d166-4088-8ad8-baa84f3d032d for this chassis.
Feb 02 10:07:38 compute-1 ovn_controller[133666]: 2026-02-02T10:07:38Z|00045|binding|INFO|c66e0be1-d166-4088-8ad8-baa84f3d032d: Claiming fa:16:3e:2f:49:24 10.100.0.18
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.824 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:38 compute-1 ceph-mon[80115]: pgmap v847: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 16 KiB/s wr, 1 op/s
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.851 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:38 compute-1 systemd-udevd[232512]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 10:07:38 compute-1 ovn_controller[133666]: 2026-02-02T10:07:38Z|00046|binding|INFO|Setting lport c66e0be1-d166-4088-8ad8-baa84f3d032d ovn-installed in OVS
Feb 02 10:07:38 compute-1 ovn_controller[133666]: 2026-02-02T10:07:38Z|00047|binding|INFO|Setting lport c66e0be1-d166-4088-8ad8-baa84f3d032d up in Southbound
Feb 02 10:07:38 compute-1 nova_compute[226294]: 2026-02-02 10:07:38.854 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.856 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:49:24 10.100.0.18'], port_security=['fa:16:3e:2f:49:24 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e125f54e-7556-49c5-8356-e7390df43c53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '22473684-a0d2-4e4f-b1c5-3e6fdbc49578', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9d42b65-630e-4d58-b649-2acc01d097b4, chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=c66e0be1-d166-4088-8ad8-baa84f3d032d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.860 143542 INFO neutron.agent.ovn.metadata.agent [-] Port c66e0be1-d166-4088-8ad8-baa84f3d032d in datapath e125f54e-7556-49c5-8356-e7390df43c53 bound to our chassis
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.863 143542 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e125f54e-7556-49c5-8356-e7390df43c53
Feb 02 10:07:38 compute-1 NetworkManager[49055]: <info>  [1770026858.8688] device (tapc66e0be1-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 02 10:07:38 compute-1 NetworkManager[49055]: <info>  [1770026858.8698] device (tapc66e0be1-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.874 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[6b57f2f3-eaad-4696-b0f2-0b4ca9b61460]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.877 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape125f54e-71 in ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
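The provision_datapath step above creates a veth pair and moves one end into the ovnmeta- namespace through privsep'd pyroute2 calls; the iproute2 commands below are a hand-run approximation of those calls, assuming the namespace already exists (the agent does not invoke the ip binary for this):

    import subprocess

    NS = "ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53"

    for cmd in (
        # veth pair: tape125f54e-70 stays in the root namespace (OVS side),
        # tape125f54e-71 is pushed into the metadata namespace.
        ["ip", "link", "add", "tape125f54e-70", "type", "veth",
         "peer", "name", "tape125f54e-71"],
        ["ip", "link", "set", "tape125f54e-71", "netns", NS],
        ["ip", "-n", NS, "link", "set", "tape125f54e-71", "up"],
        ["ip", "link", "set", "tape125f54e-70", "up"],
    ):
        subprocess.run(cmd, check=True)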
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.879 229827 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape125f54e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.879 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[16e039c8-837d-46dd-aac0-d66c92ae2c06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.881 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[30a942ce-dec5-4e94-8c7c-e29610e9d54f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.893 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd13150-8a38-4993-ab91-3f9cd30d6253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.908 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[888ba296-52c3-426d-8199-8cd9ffa4f6c8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.932 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[7bcae479-f76a-4623-b06f-ccdae8f3db50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:38 compute-1 NetworkManager[49055]: <info>  [1770026858.9375] manager: (tape125f54e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.938 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[2d92357e-39e5-4ef0-ad04-54654d0c1e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.962 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[eb99aa16-e1b7-4faf-a28f-9e0efe9b3ae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.966 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[2aba1a51-6951-4948-8304-0ae2e266883c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:38 compute-1 NetworkManager[49055]: <info>  [1770026858.9881] device (tape125f54e-70): carrier: link connected
Feb 02 10:07:38 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:38.992 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7e538b-0da0-4b1e-8d92-559254d390e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.010 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[6c16237c-7bf4-4937-8151-58b6d3f36c57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape125f54e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:b7:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402129, 'reachable_time': 43246, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232539, 'error': None, 'target': 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.017 226298 DEBUG nova.virt.libvirt.driver [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.018 226298 DEBUG nova.virt.libvirt.driver [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.018 226298 DEBUG nova.virt.libvirt.driver [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No VIF found with MAC fa:16:3e:85:9a:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.019 226298 DEBUG nova.virt.libvirt.driver [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No VIF found with MAC fa:16:3e:2f:49:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.025 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e85fc9-cbab-411b-b7e1-5b0ced62b883]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:b741'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402129, 'tstamp': 402129}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232540, 'error': None, 'target': 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.039 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[702f1213-38d5-4d51-b43f-21a976ccb059]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape125f54e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:b7:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402129, 'reachable_time': 43246, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232541, 'error': None, 'target': 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.062 226298 DEBUG nova.virt.libvirt.guest [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:07:39 compute-1 nova_compute[226294]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:07:39 compute-1 nova_compute[226294]:   <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb 02 10:07:39 compute-1 nova_compute[226294]:   <nova:creationTime>2026-02-02 10:07:39</nova:creationTime>
Feb 02 10:07:39 compute-1 nova_compute[226294]:   <nova:flavor name="m1.nano">
Feb 02 10:07:39 compute-1 nova_compute[226294]:     <nova:memory>128</nova:memory>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     <nova:disk>1</nova:disk>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     <nova:swap>0</nova:swap>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     <nova:vcpus>1</nova:vcpus>
Feb 02 10:07:39 compute-1 nova_compute[226294]:   </nova:flavor>
Feb 02 10:07:39 compute-1 nova_compute[226294]:   <nova:owner>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:07:39 compute-1 nova_compute[226294]:   </nova:owner>
Feb 02 10:07:39 compute-1 nova_compute[226294]:   <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:07:39 compute-1 nova_compute[226294]:   <nova:ports>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb 02 10:07:39 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     <nova:port uuid="c66e0be1-d166-4088-8ad8-baa84f3d032d">
Feb 02 10:07:39 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Feb 02 10:07:39 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:07:39 compute-1 nova_compute[226294]:   </nova:ports>
Feb 02 10:07:39 compute-1 nova_compute[226294]: </nova:instance>
Feb 02 10:07:39 compute-1 nova_compute[226294]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
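The <nova:instance> document logged above is stored as libvirt domain metadata under the http://openstack.org/xmlns/libvirt/nova/1.1 namespace; a small sketch parsing a saved copy of it with the standard library (the file name instance_meta.xml is an assumption for illustration):

    import xml.etree.ElementTree as ET

    NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

    root = ET.parse("instance_meta.xml").getroot()
    flavor = root.find("nova:flavor", NS)
    print("flavor:", flavor.get("name"),
          "memory_mb:", flavor.findtext("nova:memory", namespaces=NS))
    # Ports and fixed IPs as logged in the <nova:ports> section above.
    for port in root.findall(".//nova:port", NS):
        for ip in port.findall("nova:ip", NS):
            print(port.get("uuid"), ip.get("type"), ip.get("address"))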
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.063 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[8913437d-f33b-4530-9121-b3e1207c95d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.097 226298 DEBUG oslo_concurrency.lockutils [None req-b8cdc919-8a88-4a5a-82c9-2e0e7fd94e68 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.986s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:39.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.111 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[16792929-3308-48a0-a430-30da1b7efdbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.112 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape125f54e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.113 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.114 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape125f54e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.116 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:39 compute-1 kernel: tape125f54e-70: entered promiscuous mode
Feb 02 10:07:39 compute-1 NetworkManager[49055]: <info>  [1770026859.1181] manager: (tape125f54e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.120 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape125f54e-70, col_values=(('external_ids', {'iface-id': '4948ba2f-4901-4550-ab74-f4adf1b82ea1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:39 compute-1 ovn_controller[133666]: 2026-02-02T10:07:39Z|00048|binding|INFO|Releasing lport 4948ba2f-4901-4550-ab74-f4adf1b82ea1 from this chassis (sb_readonly=0)
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.128 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.129 143542 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e125f54e-7556-49c5-8356-e7390df43c53.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e125f54e-7556-49c5-8356-e7390df43c53.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.130 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[c363a1dc-879f-470e-af0f-a6f47d9c6e74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.131 143542 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: global
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     log         /dev/log local0 debug
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     log-tag     haproxy-metadata-proxy-e125f54e-7556-49c5-8356-e7390df43c53
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     user        root
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     group       root
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     maxconn     1024
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     pidfile     /var/lib/neutron/external/pids/e125f54e-7556-49c5-8356-e7390df43c53.pid.haproxy
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     daemon
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: defaults
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     log global
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     mode http
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     option httplog
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     option dontlognull
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     option http-server-close
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     option forwardfor
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     retries                 3
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     timeout http-request    30s
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     timeout connect         30s
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     timeout client          32s
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     timeout server          32s
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     timeout http-keep-alive 30s
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: listen listener
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     bind 169.254.169.254:80
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     server metadata /var/lib/neutron/metadata_proxy
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:     http-request add-header X-OVN-Network-ID e125f54e-7556-49c5-8356-e7390df43c53
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
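create_config_file writes the rendered configuration above under /var/lib/neutron/ovn-metadata-proxy/<network-id>.conf before the proxy is launched (the exact path appears in the rootwrap command logged below); a quick syntactic check of such a file can be sketched with haproxy's own -c flag:

    import subprocess

    CONF = ("/var/lib/neutron/ovn-metadata-proxy/"
            "e125f54e-7556-49c5-8356-e7390df43c53.conf")

    # -c asks haproxy to parse and validate the configuration, then exit
    # without starting any listeners.
    subprocess.run(["haproxy", "-c", "-f", CONF], check=True)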
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.131 143542 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'env', 'PROCESS_TAG=haproxy-e125f54e-7556-49c5-8356-e7390df43c53', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e125f54e-7556-49c5-8356-e7390df43c53.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 02 10:07:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:39.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.353 226298 DEBUG nova.compute.manager [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.354 226298 DEBUG oslo_concurrency.lockutils [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.354 226298 DEBUG oslo_concurrency.lockutils [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.354 226298 DEBUG oslo_concurrency.lockutils [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.355 226298 DEBUG nova.compute.manager [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.355 226298 WARNING nova.compute.manager [req-2dce6e98-e2d9-435a-be9e-b5c084040202 req-77204008-e526-4d9e-9762-6eaa0af2843e b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d for instance with vm_state active and task_state None.
Feb 02 10:07:39 compute-1 podman[232573]: 2026-02-02 10:07:39.461656959 +0000 UTC m=+0.031982230 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb 02 10:07:39 compute-1 sudo[232586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:07:39 compute-1 sudo[232586]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:07:39 compute-1 sudo[232586]: pam_unix(sudo:session): session closed for user root
Feb 02 10:07:39 compute-1 podman[232573]: 2026-02-02 10:07:39.575644308 +0000 UTC m=+0.145969539 container create e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 02 10:07:39 compute-1 systemd[1]: Started libpod-conmon-e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e.scope.
Feb 02 10:07:39 compute-1 systemd[1]: Started libcrun container.
Feb 02 10:07:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7028958bbb9ac49e703bb1728fefda69b8f73736997e2045bf747f59bb53233/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 02 10:07:39 compute-1 podman[232573]: 2026-02-02 10:07:39.683222806 +0000 UTC m=+0.253548047 container init e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:07:39 compute-1 podman[232573]: 2026-02-02 10:07:39.687578652 +0000 UTC m=+0.257903873 container start e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:07:39 compute-1 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [NOTICE]   (232617) : New worker (232619) forked
Feb 02 10:07:39 compute-1 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [NOTICE]   (232617) : Loading success.
Feb 02 10:07:39 compute-1 nova_compute[226294]: 2026-02-02 10:07:39.744 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.745 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:07:39 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:39.746 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:07:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:40 compute-1 nova_compute[226294]: 2026-02-02 10:07:40.225 226298 DEBUG nova.network.neutron [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port c66e0be1-d166-4088-8ad8-baa84f3d032d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:07:40 compute-1 nova_compute[226294]: 2026-02-02 10:07:40.225 226298 DEBUG nova.network.neutron [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:07:40 compute-1 nova_compute[226294]: 2026-02-02 10:07:40.239 226298 DEBUG oslo_concurrency.lockutils [req-4cc554dd-e12e-45ae-90d0-50b23f606996 req-ddd77cb6-5259-424c-883e-88b175954441 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:07:40 compute-1 nova_compute[226294]: 2026-02-02 10:07:40.418 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:07:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:41.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:07:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:41.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:41 compute-1 ovn_controller[133666]: 2026-02-02T10:07:41Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2f:49:24 10.100.0.18
Feb 02 10:07:41 compute-1 ovn_controller[133666]: 2026-02-02T10:07:41Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2f:49:24 10.100.0.18
Feb 02 10:07:41 compute-1 nova_compute[226294]: 2026-02-02 10:07:41.455 226298 DEBUG nova.compute.manager [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:07:41 compute-1 nova_compute[226294]: 2026-02-02 10:07:41.455 226298 DEBUG oslo_concurrency.lockutils [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:41 compute-1 nova_compute[226294]: 2026-02-02 10:07:41.456 226298 DEBUG oslo_concurrency.lockutils [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:41 compute-1 nova_compute[226294]: 2026-02-02 10:07:41.456 226298 DEBUG oslo_concurrency.lockutils [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:41 compute-1 nova_compute[226294]: 2026-02-02 10:07:41.456 226298 DEBUG nova.compute.manager [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:07:41 compute-1 nova_compute[226294]: 2026-02-02 10:07:41.457 226298 WARNING nova.compute.manager [req-a26d164b-6049-4486-a156-e18c34f5f4c5 req-cd89b7d0-eba5-4d8d-b2e5-717da56ab6ae b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d for instance with vm_state active and task_state None.
Feb 02 10:07:41 compute-1 ceph-mon[80115]: pgmap v848: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 4.7 KiB/s wr, 0 op/s
Feb 02 10:07:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:42 compute-1 ceph-mon[80115]: pgmap v849: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 4.7 KiB/s wr, 0 op/s
Feb 02 10:07:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:43.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:43.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:43 compute-1 nova_compute[226294]: 2026-02-02 10:07:43.840 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100744 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:07:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:44.908 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:07:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:44.908 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:07:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:07:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:45.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100745 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:07:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:45.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:45 compute-1 nova_compute[226294]: 2026-02-02 10:07:45.421 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:45 compute-1 ceph-mon[80115]: pgmap v850: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 5.7 KiB/s wr, 1 op/s
Feb 02 10:07:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:46 compute-1 ceph-mon[80115]: pgmap v851: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1023 B/s wr, 0 op/s
Feb 02 10:07:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:47.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:07:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:47.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:07:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:07:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:48 compute-1 nova_compute[226294]: 2026-02-02 10:07:48.843 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:48 compute-1 ceph-mon[80115]: pgmap v852: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 6.3 KiB/s wr, 1 op/s
Feb 02 10:07:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:07:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:49.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:07:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:49.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:49 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:07:49.749 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:07:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:50 compute-1 nova_compute[226294]: 2026-02-02 10:07:50.462 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4162884263' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:07:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:51.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:51.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:51 compute-1 ceph-mon[80115]: pgmap v853: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 6.3 KiB/s wr, 0 op/s
Feb 02 10:07:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b8009980 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Feb 02 10:07:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:07:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:53.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:07:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:53.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:53 compute-1 ceph-mon[80115]: pgmap v854: 353 pgs: 353 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 6.3 KiB/s wr, 0 op/s
Feb 02 10:07:53 compute-1 nova_compute[226294]: 2026-02-02 10:07:53.877 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:54 compute-1 ceph-mon[80115]: pgmap v855: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 30 op/s
Feb 02 10:07:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:55.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:55.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:55 compute-1 nova_compute[226294]: 2026-02-02 10:07:55.466 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Feb 02 10:07:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:07:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:07:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/458817904' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:07:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/458817904' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:07:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:56 compute-1 ceph-mon[80115]: pgmap v856: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Feb 02 10:07:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:57.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:07:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:57.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:07:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:07:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Feb 02 10:07:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:07:58 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3184578567' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:07:58 compute-1 nova_compute[226294]: 2026-02-02 10:07:58.903 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:07:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:07:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:07:59.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:07:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:07:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:07:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:07:59.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:07:59 compute-1 ceph-mon[80115]: pgmap v857: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 31 op/s
Feb 02 10:07:59 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2630262200' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:07:59 compute-1 sudo[232640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:07:59 compute-1 sudo[232640]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:07:59 compute-1 sudo[232640]: pam_unix(sudo:session): session closed for user root
Feb 02 10:07:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:07:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:00 compute-1 nova_compute[226294]: 2026-02-02 10:08:00.468 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Feb 02 10:08:00 compute-1 ceph-mon[80115]: pgmap v858: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 31 op/s
Feb 02 10:08:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb 02 10:08:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:01.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb 02 10:08:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:01.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:01 compute-1 podman[232666]: 2026-02-02 10:08:01.496227236 +0000 UTC m=+0.156197001 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:08:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:08:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:08:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:03.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:08:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:08:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:03.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:08:03 compute-1 ceph-mon[80115]: pgmap v859: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 31 op/s
Feb 02 10:08:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:03 compute-1 nova_compute[226294]: 2026-02-02 10:08:03.936 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100804 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:08:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:04 compute-1 ceph-mon[80115]: pgmap v860: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 106 op/s
Feb 02 10:08:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:05.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:05.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:05 compute-1 nova_compute[226294]: 2026-02-02 10:08:05.470 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:06 compute-1 ceph-mon[80115]: pgmap v861: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 76 op/s
Feb 02 10:08:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:08:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:07.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:08:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100807 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Feb 02 10:08:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:07.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:08 compute-1 podman[232695]: 2026-02-02 10:08:08.389306911 +0000 UTC m=+0.069894048 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:08:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:08 compute-1 ceph-mon[80115]: pgmap v862: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 77 op/s
Feb 02 10:08:08 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3440810205' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:08 compute-1 nova_compute[226294]: 2026-02-02 10:08:08.941 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:08:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:09.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:08:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:09.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:08:09 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/961067775' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:09 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/961067775' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:10 compute-1 nova_compute[226294]: 2026-02-02 10:08:10.473 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:10 compute-1 nova_compute[226294]: 2026-02-02 10:08:10.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:10 compute-1 nova_compute[226294]: 2026-02-02 10:08:10.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:08:10 compute-1 nova_compute[226294]: 2026-02-02 10:08:10.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:08:10 compute-1 ceph-mon[80115]: pgmap v863: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 75 op/s
Feb 02 10:08:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:11.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:11 compute-1 nova_compute[226294]: 2026-02-02 10:08:11.162 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:08:11 compute-1 nova_compute[226294]: 2026-02-02 10:08:11.163 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:08:11 compute-1 nova_compute[226294]: 2026-02-02 10:08:11.163 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 02 10:08:11 compute-1 nova_compute[226294]: 2026-02-02 10:08:11.163 226298 DEBUG nova.objects.instance [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:08:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:11.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:12 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3937949805' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:12 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2707187489' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:13 compute-1 ceph-mon[80115]: pgmap v864: 353 pgs: 353 active+clean; 167 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 75 op/s
Feb 02 10:08:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:13.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:13.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:13 compute-1 nova_compute[226294]: 2026-02-02 10:08:13.988 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84940032a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:14 compute-1 ceph-mon[80115]: pgmap v865: 353 pgs: 353 active+clean; 188 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 125 op/s
Feb 02 10:08:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:08:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:15.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.187 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:08:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:15.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.274 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.275 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.276 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.276 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.277 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.277 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.278 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.278 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.279 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.279 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.348 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.349 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.349 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.350 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.350 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.475 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:08:15 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2018632270' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.804 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:08:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2018632270' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.864 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 02 10:08:15 compute-1 nova_compute[226294]: 2026-02-02 10:08:15.865 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 02 10:08:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.070 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.071 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4694MB free_disk=59.8979606628418GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.072 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.072 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.212 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.213 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.213 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.371 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:08:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001a30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:16 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:08:16 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2361210182' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.808 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.849 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.869 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:08:16 compute-1 ceph-mon[80115]: pgmap v866: 353 pgs: 353 active+clean; 188 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 283 KiB/s rd, 2.0 MiB/s wr, 50 op/s
Feb 02 10:08:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2361210182' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.870 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.871 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.871 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.871 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.885 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.885 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.885 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 02 10:08:16 compute-1 nova_compute[226294]: 2026-02-02 10:08:16.898 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:17.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:08:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:17.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:08:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:08:17 compute-1 nova_compute[226294]: 2026-02-02 10:08:17.904 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:17 compute-1 nova_compute[226294]: 2026-02-02 10:08:17.905 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:18 compute-1 sshd-session[232767]: Invalid user solv from 80.94.92.184 port 53340
Feb 02 10:08:18 compute-1 sshd-session[232767]: Connection closed by invalid user solv 80.94.92.184 port 53340 [preauth]
Feb 02 10:08:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:18 compute-1 ceph-mon[80115]: pgmap v867: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Feb 02 10:08:18 compute-1 nova_compute[226294]: 2026-02-02 10:08:18.992 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:19.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:19.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:19 compute-1 sudo[232771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:08:19 compute-1 sudo[232771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:08:19 compute-1 sudo[232771]: pam_unix(sudo:session): session closed for user root
Feb 02 10:08:19 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4001a30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:20 compute-1 nova_compute[226294]: 2026-02-02 10:08:20.494 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:20 compute-1 ceph-mon[80115]: pgmap v868: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Feb 02 10:08:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:08:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:21.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:08:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:21.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:21 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:21 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:22 compute-1 ceph-mon[80115]: pgmap v869: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Feb 02 10:08:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:23.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:23.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:23 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:23 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:24 compute-1 nova_compute[226294]: 2026-02-02 10:08:24.025 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:24 compute-1 ovn_controller[133666]: 2026-02-02T10:08:24Z|00049|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Feb 02 10:08:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:24 compute-1 ceph-mon[80115]: pgmap v870: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.2 MiB/s wr, 67 op/s
Feb 02 10:08:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:08:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:25.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:08:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:08:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:25.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:08:25 compute-1 nova_compute[226294]: 2026-02-02 10:08:25.382 226298 DEBUG nova.compute.manager [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:08:25 compute-1 nova_compute[226294]: 2026-02-02 10:08:25.383 226298 DEBUG nova.compute.manager [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-c66e0be1-d166-4088-8ad8-baa84f3d032d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:08:25 compute-1 nova_compute[226294]: 2026-02-02 10:08:25.383 226298 DEBUG oslo_concurrency.lockutils [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:08:25 compute-1 nova_compute[226294]: 2026-02-02 10:08:25.383 226298 DEBUG oslo_concurrency.lockutils [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:08:25 compute-1 nova_compute[226294]: 2026-02-02 10:08:25.384 226298 DEBUG nova.network.neutron [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port c66e0be1-d166-4088-8ad8-baa84f3d032d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:08:25 compute-1 nova_compute[226294]: 2026-02-02 10:08:25.496 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:25 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:25 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c001f10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:26 compute-1 ceph-mon[80115]: pgmap v871: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 42 KiB/s rd, 115 KiB/s wr, 16 op/s
Feb 02 10:08:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:27.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:27.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:27 compute-1 nova_compute[226294]: 2026-02-02 10:08:27.408 226298 DEBUG nova.network.neutron [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port c66e0be1-d166-4088-8ad8-baa84f3d032d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:08:27 compute-1 nova_compute[226294]: 2026-02-02 10:08:27.408 226298 DEBUG nova.network.neutron [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:08:27 compute-1 nova_compute[226294]: 2026-02-02 10:08:27.421 226298 DEBUG oslo_concurrency.lockutils [req-129c5940-133c-4a38-ab87-93cdb2440aac req-40c57ac7-1efd-445f-8469-7b8a984d93bb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:08:27 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:27 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:28 compute-1 ceph-mon[80115]: pgmap v872: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 42 KiB/s rd, 122 KiB/s wr, 17 op/s
Feb 02 10:08:29 compute-1 nova_compute[226294]: 2026-02-02 10:08:29.058 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:29.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:29.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:29 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:29 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:30 compute-1 nova_compute[226294]: 2026-02-02 10:08:30.537 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:30 compute-1 ceph-mon[80115]: pgmap v873: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 27 KiB/s wr, 2 op/s
Feb 02 10:08:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:31.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:31.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:31 compute-1 sudo[232802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:08:31 compute-1 sudo[232802]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:08:31 compute-1 sudo[232802]: pam_unix(sudo:session): session closed for user root
Feb 02 10:08:31 compute-1 sudo[232833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:08:31 compute-1 sudo[232833]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:08:31 compute-1 podman[232826]: 2026-02-02 10:08:31.642891158 +0000 UTC m=+0.090379023 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:08:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:31 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:32 compute-1 sudo[232833]: pam_unix(sudo:session): session closed for user root
Feb 02 10:08:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:08:33 compute-1 ceph-mon[80115]: pgmap v874: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 27 KiB/s wr, 2 op/s
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:08:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:08:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:33.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:33.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:33 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:33 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:34 compute-1 nova_compute[226294]: 2026-02-02 10:08:34.066 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:34 compute-1 ceph-mon[80115]: pgmap v875: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 30 KiB/s wr, 3 op/s
Feb 02 10:08:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:35.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:35.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:35 compute-1 nova_compute[226294]: 2026-02-02 10:08:35.591 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:35 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:35 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:36 compute-1 ceph-mon[80115]: pgmap v876: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 9.7 KiB/s wr, 1 op/s
Feb 02 10:08:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:08:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:37.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:08:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:37.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:37 compute-1 sudo[232912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:08:37 compute-1 sudo[232912]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:08:37 compute-1 sudo[232912]: pam_unix(sudo:session): session closed for user root
Feb 02 10:08:37 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:37 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:08:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:08:39 compute-1 nova_compute[226294]: 2026-02-02 10:08:39.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:08:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:39.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:08:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:39.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:39 compute-1 podman[232938]: 2026-02-02 10:08:39.392508601 +0000 UTC m=+0.062843151 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 02 10:08:39 compute-1 ceph-mon[80115]: pgmap v877: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 12 KiB/s wr, 1 op/s
Feb 02 10:08:39 compute-1 sudo[232959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:08:39 compute-1 sudo[232959]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:08:39 compute-1 sudo[232959]: pam_unix(sudo:session): session closed for user root
Feb 02 10:08:39 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:39 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:40 compute-1 nova_compute[226294]: 2026-02-02 10:08:40.405 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:08:40 compute-1 nova_compute[226294]: 2026-02-02 10:08:40.434 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Triggering sync for uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 02 10:08:40 compute-1 nova_compute[226294]: 2026-02-02 10:08:40.435 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:08:40 compute-1 nova_compute[226294]: 2026-02-02 10:08:40.435 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:08:40 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:40 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:40 compute-1 nova_compute[226294]: 2026-02-02 10:08:40.496 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:08:40 compute-1 nova_compute[226294]: 2026-02-02 10:08:40.614 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:40 compute-1 ceph-mon[80115]: pgmap v878: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 5.3 KiB/s wr, 1 op/s
Feb 02 10:08:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:08:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:41.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:08:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:41.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:41 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:41 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:42 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:42 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:42 compute-1 ceph-mon[80115]: pgmap v879: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 5.3 KiB/s wr, 1 op/s
Feb 02 10:08:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:43.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:08:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:43.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:08:43 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:43 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b4004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:44 compute-1 nova_compute[226294]: 2026-02-02 10:08:44.072 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:44 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:44 compute-1 ceph-mon[80115]: pgmap v880: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 6.7 KiB/s wr, 1 op/s
Feb 02 10:08:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:44.908 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:08:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:44.908 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:08:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:08:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:08:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:45.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:08:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:45.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:45 compute-1 nova_compute[226294]: 2026-02-02 10:08:45.654 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:45 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:45 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:46 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:46 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:46 compute-1 ceph-mon[80115]: pgmap v881: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.7 KiB/s wr, 0 op/s
Feb 02 10:08:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:08:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:47.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:08:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:47.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:08:47 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:47 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c004340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b150 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:48 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:48 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:48 compute-1 ceph-mon[80115]: pgmap v882: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 6.0 KiB/s wr, 1 op/s
Feb 02 10:08:49 compute-1 nova_compute[226294]: 2026-02-02 10:08:49.075 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:08:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:49.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:08:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:49.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:49 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:49 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:50 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b170 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:50 compute-1 nova_compute[226294]: 2026-02-02 10:08:50.708 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:50 compute-1 ceph-mon[80115]: pgmap v883: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.7 KiB/s wr, 0 op/s
Feb 02 10:08:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:51.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:51.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:51 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:51 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8494001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:52 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:52 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:52 compute-1 ceph-mon[80115]: pgmap v884: 353 pgs: 353 active+clean; 200 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.7 KiB/s wr, 0 op/s
Feb 02 10:08:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:53.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:53.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:53 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:53 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:54 compute-1 nova_compute[226294]: 2026-02-02 10:08:54.079 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:54 compute-1 nova_compute[226294]: 2026-02-02 10:08:54.228 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:54 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:54.229 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:08:54 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:54.230 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:08:54 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:54 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:54 compute-1 ceph-mon[80115]: pgmap v885: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 7.5 KiB/s wr, 29 op/s
Feb 02 10:08:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 02 10:08:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2188296208' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:08:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 02 10:08:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2188296208' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:08:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:55.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:55.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:55 compute-1 nova_compute[226294]: 2026-02-02 10:08:55.711 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2188296208' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:08:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2188296208' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:08:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4280504931' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:08:55 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:55 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:56 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:56 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:56 compute-1 ceph-mon[80115]: pgmap v886: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 6.2 KiB/s wr, 28 op/s
Feb 02 10:08:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:08:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:57.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:08:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:57.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.489 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-c66e0be1-d166-4088-8ad8-baa84f3d032d" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.489 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-c66e0be1-d166-4088-8ad8-baa84f3d032d" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.508 226298 DEBUG nova.objects.instance [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'flavor' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.535 226298 DEBUG nova.virt.libvirt.vif [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.535 226298 DEBUG nova.network.os_vif_util [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.536 226298 DEBUG nova.network.os_vif_util [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.540 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.542 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.545 226298 DEBUG nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Attempting to detach device tapc66e0be1-d1 from instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.546 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] detach device xml: <interface type="ethernet">
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <mac address="fa:16:3e:2f:49:24"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <model type="virtio"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <driver name="vhost" rx_queue_size="512"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <mtu size="1442"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <target dev="tapc66e0be1-d1"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]: </interface>
Feb 02 10:08:57 compute-1 nova_compute[226294]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.552 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.555 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface>not found in domain: <domain type='kvm' id='2'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <name>instance-00000006</name>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <metadata>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:creationTime>2026-02-02 10:07:39</nova:creationTime>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:flavor name="m1.nano">
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:memory>128</nova:memory>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:disk>1</nova:disk>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:swap>0</nova:swap>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:vcpus>1</nova:vcpus>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </nova:flavor>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:owner>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </nova:owner>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:ports>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:port uuid="c66e0be1-d166-4088-8ad8-baa84f3d032d">
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </nova:ports>
Feb 02 10:08:57 compute-1 nova_compute[226294]: </nova:instance>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </metadata>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <memory unit='KiB'>131072</memory>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <vcpu placement='static'>1</vcpu>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <resource>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <partition>/machine</partition>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </resource>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <sysinfo type='smbios'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <system>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='manufacturer'>RDO</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='product'>OpenStack Compute</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='serial'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='uuid'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='family'>Virtual Machine</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </system>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </sysinfo>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <os>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <boot dev='hd'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <smbios mode='sysinfo'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </os>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <features>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <acpi/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <apic/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <vmcoreinfo state='on'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </features>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <cpu mode='custom' match='exact' check='full'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <vendor>AMD</vendor>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='x2apic'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='tsc-deadline'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='hypervisor'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='tsc_adjust'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='spec-ctrl'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='stibp'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='ssbd'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='cmp_legacy'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='overflow-recov'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='succor'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='ibrs'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='amd-ssbd'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='virt-ssbd'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='lbrv'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='tsc-scale'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='vmcb-clean'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='flushbyasid'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='pause-filter'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='pfthreshold'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='xsaves'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='svm'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='topoext'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='npt'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='nrip-save'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </cpu>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <clock offset='utc'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <timer name='pit' tickpolicy='delay'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <timer name='hpet' present='no'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </clock>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <on_poweroff>destroy</on_poweroff>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <on_reboot>restart</on_reboot>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <on_crash>destroy</on_crash>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <devices>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <disk type='network' device='disk'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <driver name='qemu' type='raw' cache='none'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <auth username='openstack'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk' index='2'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.100' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.102' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.101' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </source>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target dev='vda' bus='virtio'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='virtio-disk0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <disk type='network' device='cdrom'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <driver name='qemu' type='raw' cache='none'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <auth username='openstack'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config' index='1'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.100' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.102' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.101' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </source>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target dev='sda' bus='sata'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <readonly/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='sata0-0-0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='0' model='pcie-root'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pcie.0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='1' port='0x10'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='2' port='0x11'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='3' port='0x12'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.3'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='4' port='0x13'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.4'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='5' port='0x14'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.5'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='6' port='0x15'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.6'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='7' port='0x16'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.7'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='8' port='0x17'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.8'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='9' port='0x18'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.9'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='10' port='0x19'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.10'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='11' port='0x1a'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.11'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='12' port='0x1b'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.12'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='13' port='0x1c'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.13'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='14' port='0x1d'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.14'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='15' port='0x1e'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.15'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='16' port='0x1f'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.16'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='17' port='0x20'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.17'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='18' port='0x21'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.18'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='19' port='0x22'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.19'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='20' port='0x23'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.20'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='21' port='0x24'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.21'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='22' port='0x25'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.22'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='23' port='0x26'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.23'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='24' port='0x27'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.24'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='25' port='0x28'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.25'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-pci-bridge'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.26'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='usb'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='sata' index='0'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='ide'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <interface type='ethernet'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <mac address='fa:16:3e:85:9a:96'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target dev='tap09a00258-4f'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model type='virtio'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <driver name='vhost' rx_queue_size='512'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <mtu size='1442'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='net0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </interface>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <interface type='ethernet'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <mac address='fa:16:3e:2f:49:24'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target dev='tapc66e0be1-d1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model type='virtio'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <driver name='vhost' rx_queue_size='512'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <mtu size='1442'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='net1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </interface>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <serial type='pty'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <source path='/dev/pts/0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target type='isa-serial' port='0'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <model name='isa-serial'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </target>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='serial0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </serial>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <console type='pty' tty='/dev/pts/0'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <source path='/dev/pts/0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target type='serial' port='0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='serial0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </console>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <input type='tablet' bus='usb'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='input0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='usb' bus='0' port='1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <input type='mouse' bus='ps2'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='input1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <input type='keyboard' bus='ps2'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='input2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <listen type='address' address='::0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </graphics>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <audio id='1' type='none'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <video>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model type='virtio' heads='1' primary='yes'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='video0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </video>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <watchdog model='itco' action='reset'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='watchdog0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </watchdog>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <memballoon model='virtio'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <stats period='10'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='balloon0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </memballoon>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <rng model='virtio'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <backend model='random'>/dev/urandom</backend>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='rng0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </rng>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </devices>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <label>system_u:system_r:svirt_t:s0:c659,c775</label>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c659,c775</imagelabel>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </seclabel>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <label>+107:+107</label>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <imagelabel>+107:+107</imagelabel>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </seclabel>
Feb 02 10:08:57 compute-1 nova_compute[226294]: </domain>
Feb 02 10:08:57 compute-1 nova_compute[226294]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.556 226298 INFO nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully detached device tapc66e0be1-d1 from instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 from the persistent domain config.
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.557 226298 DEBUG nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] (1/8): Attempting to detach device tapc66e0be1-d1 with device alias net1 from instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.558 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] detach device xml: <interface type="ethernet">
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <mac address="fa:16:3e:2f:49:24"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <model type="virtio"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <driver name="vhost" rx_queue_size="512"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <mtu size="1442"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <target dev="tapc66e0be1-d1"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]: </interface>
Feb 02 10:08:57 compute-1 nova_compute[226294]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 02 10:08:57 compute-1 kernel: tapc66e0be1-d1 (unregistering): left promiscuous mode
Feb 02 10:08:57 compute-1 NetworkManager[49055]: <info>  [1770026937.6762] device (tapc66e0be1-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.682 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:57 compute-1 ovn_controller[133666]: 2026-02-02T10:08:57Z|00050|binding|INFO|Releasing lport c66e0be1-d166-4088-8ad8-baa84f3d032d from this chassis (sb_readonly=0)
Feb 02 10:08:57 compute-1 ovn_controller[133666]: 2026-02-02T10:08:57Z|00051|binding|INFO|Setting lport c66e0be1-d166-4088-8ad8-baa84f3d032d down in Southbound
Feb 02 10:08:57 compute-1 ovn_controller[133666]: 2026-02-02T10:08:57Z|00052|binding|INFO|Removing iface tapc66e0be1-d1 ovn-installed in OVS
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.686 226298 DEBUG nova.virt.libvirt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Received event <DeviceRemovedEvent: 1770026937.6865523, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Feb 02 10:08:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.690 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:49:24 10.100.0.18', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e125f54e-7556-49c5-8356-e7390df43c53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9d42b65-630e-4d58-b649-2acc01d097b4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=c66e0be1-d166-4088-8ad8-baa84f3d032d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.690 226298 DEBUG nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Start waiting for the detach event from libvirt for device tapc66e0be1-d1 with device alias net1 for instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.691 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 02 10:08:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.691 143542 INFO neutron.agent.ovn.metadata.agent [-] Port c66e0be1-d166-4088-8ad8-baa84f3d032d in datapath e125f54e-7556-49c5-8356-e7390df43c53 unbound from our chassis
Feb 02 10:08:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.692 143542 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e125f54e-7556-49c5-8356-e7390df43c53, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 02 10:08:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.693 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[c93a1ce9-0db9-459e-8786-29195437a201]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:08:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.694 143542 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 namespace which is not needed anymore
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.695 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.697 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface>not found in domain: <domain type='kvm' id='2'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <name>instance-00000006</name>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <metadata>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:creationTime>2026-02-02 10:07:39</nova:creationTime>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:flavor name="m1.nano">
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:memory>128</nova:memory>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:disk>1</nova:disk>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:swap>0</nova:swap>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:vcpus>1</nova:vcpus>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </nova:flavor>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:owner>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </nova:owner>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:ports>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:port uuid="c66e0be1-d166-4088-8ad8-baa84f3d032d">
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </nova:ports>
Feb 02 10:08:57 compute-1 nova_compute[226294]: </nova:instance>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </metadata>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <memory unit='KiB'>131072</memory>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <vcpu placement='static'>1</vcpu>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <resource>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <partition>/machine</partition>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </resource>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <sysinfo type='smbios'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <system>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='manufacturer'>RDO</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='product'>OpenStack Compute</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='serial'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='uuid'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <entry name='family'>Virtual Machine</entry>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </system>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </sysinfo>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <os>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <boot dev='hd'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <smbios mode='sysinfo'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </os>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <features>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <acpi/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <apic/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <vmcoreinfo state='on'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </features>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <cpu mode='custom' match='exact' check='full'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <vendor>AMD</vendor>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='x2apic'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='tsc-deadline'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='hypervisor'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='tsc_adjust'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='spec-ctrl'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='stibp'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='ssbd'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='cmp_legacy'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='overflow-recov'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='succor'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='ibrs'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='amd-ssbd'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='virt-ssbd'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='lbrv'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='tsc-scale'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='vmcb-clean'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='flushbyasid'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='pause-filter'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='pfthreshold'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='xsaves'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='svm'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='require' name='topoext'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='npt'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <feature policy='disable' name='nrip-save'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </cpu>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <clock offset='utc'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <timer name='pit' tickpolicy='delay'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <timer name='hpet' present='no'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </clock>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <on_poweroff>destroy</on_poweroff>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <on_reboot>restart</on_reboot>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <on_crash>destroy</on_crash>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <devices>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <disk type='network' device='disk'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <driver name='qemu' type='raw' cache='none'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <auth username='openstack'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk' index='2'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.100' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.102' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.101' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </source>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target dev='vda' bus='virtio'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='virtio-disk0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <disk type='network' device='cdrom'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <driver name='qemu' type='raw' cache='none'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <auth username='openstack'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config' index='1'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.100' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.102' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <host name='192.168.122.101' port='6789'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </source>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target dev='sda' bus='sata'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <readonly/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='sata0-0-0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='0' model='pcie-root'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pcie.0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='1' port='0x10'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='2' port='0x11'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='3' port='0x12'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.3'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='4' port='0x13'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.4'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='5' port='0x14'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.5'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='6' port='0x15'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.6'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='7' port='0x16'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.7'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='8' port='0x17'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.8'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='9' port='0x18'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.9'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='10' port='0x19'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.10'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='11' port='0x1a'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.11'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='12' port='0x1b'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.12'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='13' port='0x1c'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.13'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='14' port='0x1d'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.14'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='15' port='0x1e'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.15'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='16' port='0x1f'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.16'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='17' port='0x20'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.17'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='18' port='0x21'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.18'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='19' port='0x22'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.19'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='20' port='0x23'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.20'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='21' port='0x24'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.21'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='22' port='0x25'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.22'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='23' port='0x26'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.23'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='24' port='0x27'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.24'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target chassis='25' port='0x28'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.25'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model name='pcie-pci-bridge'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='pci.26'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='usb'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <controller type='sata' index='0'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='ide'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <interface type='ethernet'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <mac address='fa:16:3e:85:9a:96'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target dev='tap09a00258-4f'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model type='virtio'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <driver name='vhost' rx_queue_size='512'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <mtu size='1442'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='net0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </interface>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <serial type='pty'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <source path='/dev/pts/0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target type='isa-serial' port='0'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:         <model name='isa-serial'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       </target>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='serial0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </serial>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <console type='pty' tty='/dev/pts/0'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <source path='/dev/pts/0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <target type='serial' port='0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='serial0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </console>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <input type='tablet' bus='usb'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='input0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='usb' bus='0' port='1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <input type='mouse' bus='ps2'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='input1'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <input type='keyboard' bus='ps2'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='input2'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <listen type='address' address='::0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </graphics>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <audio id='1' type='none'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <video>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <model type='virtio' heads='1' primary='yes'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='video0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </video>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <watchdog model='itco' action='reset'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='watchdog0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </watchdog>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <memballoon model='virtio'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <stats period='10'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='balloon0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </memballoon>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <rng model='virtio'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <backend model='random'>/dev/urandom</backend>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <alias name='rng0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </rng>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </devices>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <label>system_u:system_r:svirt_t:s0:c659,c775</label>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c659,c775</imagelabel>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </seclabel>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <label>+107:+107</label>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <imagelabel>+107:+107</imagelabel>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </seclabel>
Feb 02 10:08:57 compute-1 nova_compute[226294]: </domain>
Feb 02 10:08:57 compute-1 nova_compute[226294]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.697 226298 INFO nova.virt.libvirt.driver [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully detached device tapc66e0be1-d1 from instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 from the live domain config.
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.699 226298 DEBUG nova.virt.libvirt.vif [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.700 226298 DEBUG nova.network.os_vif_util [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.701 226298 DEBUG nova.network.os_vif_util [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.702 226298 DEBUG os_vif [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.705 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.705 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc66e0be1-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.707 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.711 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.714 226298 INFO os_vif [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1')
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.715 226298 DEBUG nova.virt.libvirt.guest [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:creationTime>2026-02-02 10:08:57</nova:creationTime>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:flavor name="m1.nano">
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:memory>128</nova:memory>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:disk>1</nova:disk>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:swap>0</nova:swap>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:vcpus>1</nova:vcpus>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </nova:flavor>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:owner>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </nova:owner>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   <nova:ports>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb 02 10:08:57 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 02 10:08:57 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:08:57 compute-1 nova_compute[226294]:   </nova:ports>
Feb 02 10:08:57 compute-1 nova_compute[226294]: </nova:instance>
Feb 02 10:08:57 compute-1 nova_compute[226294]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 02 10:08:57 compute-1 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [NOTICE]   (232617) : haproxy version is 2.8.14-c23fe91
Feb 02 10:08:57 compute-1 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [NOTICE]   (232617) : path to executable is /usr/sbin/haproxy
Feb 02 10:08:57 compute-1 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [WARNING]  (232617) : Exiting Master process...
Feb 02 10:08:57 compute-1 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [WARNING]  (232617) : Exiting Master process...
Feb 02 10:08:57 compute-1 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [ALERT]    (232617) : Current worker (232619) exited with code 143 (Terminated)
Feb 02 10:08:57 compute-1 neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53[232613]: [WARNING]  (232617) : All workers exited. Exiting... (0)
Feb 02 10:08:57 compute-1 systemd[1]: libpod-e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e.scope: Deactivated successfully.
Feb 02 10:08:57 compute-1 podman[233021]: 2026-02-02 10:08:57.860983219 +0000 UTC m=+0.062057950 container died e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:08:57 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e-userdata-shm.mount: Deactivated successfully.
Feb 02 10:08:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-a7028958bbb9ac49e703bb1728fefda69b8f73736997e2045bf747f59bb53233-merged.mount: Deactivated successfully.
Feb 02 10:08:57 compute-1 podman[233021]: 2026-02-02 10:08:57.908790769 +0000 UTC m=+0.109865450 container cleanup e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 02 10:08:57 compute-1 systemd[1]: libpod-conmon-e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e.scope: Deactivated successfully.
Feb 02 10:08:57 compute-1 podman[233052]: 2026-02-02 10:08:57.966918853 +0000 UTC m=+0.039744507 container remove e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:08:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.971 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[4c49446b-dcf5-4302-bce9-72f0d53822a5]: (4, ('Mon Feb  2 10:08:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 (e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e)\ne67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e\nMon Feb  2 10:08:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 (e67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e)\ne67bf652ad78a830f4e9d185df6fed75ea7c2a468c551e9f2677a0fa3cb6602e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:08:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.973 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[6cfda328-e2a2-4fad-8174-779a6b2f1a36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:08:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.974 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape125f54e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.975 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:57 compute-1 kernel: tape125f54e-70: left promiscuous mode
Feb 02 10:08:57 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:57 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b1b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:57 compute-1 nova_compute[226294]: 2026-02-02 10:08:57.985 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:57.988 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1e7c1b-9377-4464-aafc-2717394ce588]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:08:58 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.002 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[625d9a8a-a61c-41be-bf2f-cc2c98bca11c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:08:58 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.004 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[81af7ca5-486e-4116-8fe4-4f72703f4c1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:08:58 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.016 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[aedc4d99-3611-4aed-b785-9fc8d4c019e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402123, 'reachable_time': 19451, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233067, 'error': None, 'target': 'ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:08:58 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.019 143813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e125f54e-7556-49c5-8356-e7390df43c53 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 02 10:08:58 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:08:58.019 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf5be8e-1268-4661-86f3-383ab39319fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:08:58 compute-1 systemd[1]: run-netns-ovnmeta\x2de125f54e\x2d7556\x2d49c5\x2d8356\x2de7390df43c53.mount: Deactivated successfully.
Feb 02 10:08:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.213 226298 DEBUG nova.compute.manager [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-unplugged-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.214 226298 DEBUG oslo_concurrency.lockutils [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.214 226298 DEBUG oslo_concurrency.lockutils [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.214 226298 DEBUG oslo_concurrency.lockutils [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.215 226298 DEBUG nova.compute.manager [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-unplugged-c66e0be1-d166-4088-8ad8-baa84f3d032d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.215 226298 WARNING nova.compute.manager [req-53498c25-a5e9-4ec2-83a0-a8bffea887ea req-5e1f6643-d494-4292-9e45-e3664ef7c309 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-unplugged-c66e0be1-d166-4088-8ad8-baa84f3d032d for instance with vm_state active and task_state None.
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.481 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.482 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.482 226298 DEBUG nova.network.neutron [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 02 10:08:58 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:58 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.524 226298 DEBUG nova.compute.manager [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-deleted-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.524 226298 INFO nova.compute.manager [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Neutron deleted interface c66e0be1-d166-4088-8ad8-baa84f3d032d; detaching it from the instance and deleting it from the info cache
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.524 226298 DEBUG nova.network.neutron [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.545 226298 DEBUG nova.objects.instance [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lazy-loading 'system_metadata' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.598 226298 DEBUG nova.objects.instance [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lazy-loading 'flavor' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.635 226298 DEBUG nova.virt.libvirt.vif [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.636 226298 DEBUG nova.network.os_vif_util [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.637 226298 DEBUG nova.network.os_vif_util [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.642 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.647 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface>not found in domain: <domain type='kvm' id='2'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <name>instance-00000006</name>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <metadata>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:creationTime>2026-02-02 10:08:57</nova:creationTime>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:flavor name="m1.nano">
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:memory>128</nova:memory>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:disk>1</nova:disk>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:swap>0</nova:swap>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:vcpus>1</nova:vcpus>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </nova:flavor>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:owner>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </nova:owner>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:ports>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </nova:ports>
Feb 02 10:08:58 compute-1 nova_compute[226294]: </nova:instance>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </metadata>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <memory unit='KiB'>131072</memory>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <vcpu placement='static'>1</vcpu>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <resource>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <partition>/machine</partition>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </resource>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <sysinfo type='smbios'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <system>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='manufacturer'>RDO</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='product'>OpenStack Compute</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='serial'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='uuid'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='family'>Virtual Machine</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </system>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </sysinfo>
Feb 02 10:08:58 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <os>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <boot dev='hd'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <smbios mode='sysinfo'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </os>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <features>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <acpi/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <apic/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <vmcoreinfo state='on'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </features>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <cpu mode='custom' match='exact' check='full'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <vendor>AMD</vendor>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='x2apic'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='tsc-deadline'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='hypervisor'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='tsc_adjust'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='spec-ctrl'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='stibp'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='ssbd'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='cmp_legacy'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='overflow-recov'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='succor'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='ibrs'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='amd-ssbd'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='virt-ssbd'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='lbrv'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='tsc-scale'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='vmcb-clean'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='flushbyasid'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='pause-filter'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='pfthreshold'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='xsaves'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='svm'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='topoext'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='npt'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='nrip-save'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </cpu>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <clock offset='utc'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <timer name='pit' tickpolicy='delay'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <timer name='hpet' present='no'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </clock>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <on_poweroff>destroy</on_poweroff>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <on_reboot>restart</on_reboot>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <on_crash>destroy</on_crash>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <devices>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <disk type='network' device='disk'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <driver name='qemu' type='raw' cache='none'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <auth username='openstack'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk' index='2'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.100' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.102' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.101' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </source>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target dev='vda' bus='virtio'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='virtio-disk0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <disk type='network' device='cdrom'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <driver name='qemu' type='raw' cache='none'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <auth username='openstack'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config' index='1'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.100' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.102' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.101' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </source>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target dev='sda' bus='sata'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <readonly/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='sata0-0-0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='0' model='pcie-root'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pcie.0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='1' port='0x10'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='2' port='0x11'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='3' port='0x12'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.3'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='4' port='0x13'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.4'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='5' port='0x14'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.5'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='6' port='0x15'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.6'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='7' port='0x16'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.7'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='8' port='0x17'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.8'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='9' port='0x18'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.9'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='10' port='0x19'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.10'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='11' port='0x1a'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.11'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='12' port='0x1b'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.12'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='13' port='0x1c'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.13'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='14' port='0x1d'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.14'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='15' port='0x1e'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.15'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='16' port='0x1f'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.16'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='17' port='0x20'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.17'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='18' port='0x21'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.18'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='19' port='0x22'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.19'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='20' port='0x23'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.20'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='21' port='0x24'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.21'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='22' port='0x25'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.22'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='23' port='0x26'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.23'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='24' port='0x27'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.24'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='25' port='0x28'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.25'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-pci-bridge'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.26'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='usb'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='sata' index='0'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='ide'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <interface type='ethernet'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <mac address='fa:16:3e:85:9a:96'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target dev='tap09a00258-4f'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model type='virtio'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <driver name='vhost' rx_queue_size='512'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <mtu size='1442'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='net0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </interface>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <serial type='pty'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <source path='/dev/pts/0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target type='isa-serial' port='0'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <model name='isa-serial'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </target>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='serial0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </serial>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <console type='pty' tty='/dev/pts/0'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <source path='/dev/pts/0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target type='serial' port='0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='serial0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </console>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <input type='tablet' bus='usb'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='input0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='usb' bus='0' port='1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <input type='mouse' bus='ps2'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='input1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <input type='keyboard' bus='ps2'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='input2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <listen type='address' address='::0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </graphics>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <audio id='1' type='none'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <video>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model type='virtio' heads='1' primary='yes'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='video0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </video>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <watchdog model='itco' action='reset'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='watchdog0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </watchdog>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <memballoon model='virtio'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <stats period='10'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='balloon0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </memballoon>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <rng model='virtio'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <backend model='random'>/dev/urandom</backend>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='rng0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </rng>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </devices>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <label>system_u:system_r:svirt_t:s0:c659,c775</label>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c659,c775</imagelabel>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </seclabel>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <label>+107:+107</label>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <imagelabel>+107:+107</imagelabel>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </seclabel>
Feb 02 10:08:58 compute-1 nova_compute[226294]: </domain>
Feb 02 10:08:58 compute-1 nova_compute[226294]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.648 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.653 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:49:24"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc66e0be1-d1"/></interface>not found in domain: <domain type='kvm' id='2'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <name>instance-00000006</name>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <uuid>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</uuid>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <metadata>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:creationTime>2026-02-02 10:08:57</nova:creationTime>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:flavor name="m1.nano">
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:memory>128</nova:memory>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:disk>1</nova:disk>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:swap>0</nova:swap>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:vcpus>1</nova:vcpus>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </nova:flavor>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:owner>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </nova:owner>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:ports>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </nova:ports>
Feb 02 10:08:58 compute-1 nova_compute[226294]: </nova:instance>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </metadata>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <memory unit='KiB'>131072</memory>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <vcpu placement='static'>1</vcpu>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <resource>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <partition>/machine</partition>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </resource>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <sysinfo type='smbios'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <system>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='manufacturer'>RDO</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='product'>OpenStack Compute</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='serial'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='uuid'>15b2e821-5e8b-4d8a-9a48-7c6a30bd3220</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <entry name='family'>Virtual Machine</entry>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </system>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </sysinfo>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <os>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <boot dev='hd'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <smbios mode='sysinfo'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </os>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <features>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <acpi/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <apic/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <vmcoreinfo state='on'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </features>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <cpu mode='custom' match='exact' check='full'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <vendor>AMD</vendor>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='x2apic'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='tsc-deadline'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='hypervisor'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='tsc_adjust'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='spec-ctrl'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='stibp'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='ssbd'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='cmp_legacy'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='overflow-recov'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='succor'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='ibrs'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='amd-ssbd'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='virt-ssbd'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='lbrv'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='tsc-scale'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='vmcb-clean'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='flushbyasid'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='pause-filter'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='pfthreshold'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='xsaves'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='svm'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='require' name='topoext'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='npt'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <feature policy='disable' name='nrip-save'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </cpu>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <clock offset='utc'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <timer name='pit' tickpolicy='delay'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <timer name='hpet' present='no'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </clock>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <on_poweroff>destroy</on_poweroff>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <on_reboot>restart</on_reboot>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <on_crash>destroy</on_crash>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <devices>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <disk type='network' device='disk'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <driver name='qemu' type='raw' cache='none'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <auth username='openstack'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk' index='2'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.100' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.102' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.101' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </source>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target dev='vda' bus='virtio'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='virtio-disk0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <disk type='network' device='cdrom'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <driver name='qemu' type='raw' cache='none'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <auth username='openstack'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <secret type='ceph' uuid='d241d473-9fcb-5f74-b163-f1ca4454e7f1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <source protocol='rbd' name='vms/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_disk.config' index='1'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.100' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.102' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <host name='192.168.122.101' port='6789'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </source>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target dev='sda' bus='sata'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <readonly/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='sata0-0-0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='0' model='pcie-root'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pcie.0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='1' port='0x10'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='2' port='0x11'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='3' port='0x12'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.3'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='4' port='0x13'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.4'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='5' port='0x14'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.5'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='6' port='0x15'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.6'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='7' port='0x16'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.7'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='8' port='0x17'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.8'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='9' port='0x18'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.9'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='10' port='0x19'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.10'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='11' port='0x1a'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.11'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='12' port='0x1b'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.12'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='13' port='0x1c'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.13'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='14' port='0x1d'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.14'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='15' port='0x1e'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.15'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='16' port='0x1f'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.16'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='17' port='0x20'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.17'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='18' port='0x21'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.18'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='19' port='0x22'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.19'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='20' port='0x23'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.20'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='21' port='0x24'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.21'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='22' port='0x25'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.22'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='23' port='0x26'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.23'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='24' port='0x27'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.24'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-root-port'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target chassis='25' port='0x28'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.25'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model name='pcie-pci-bridge'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='pci.26'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='usb'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <controller type='sata' index='0'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='ide'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </controller>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <interface type='ethernet'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <mac address='fa:16:3e:85:9a:96'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target dev='tap09a00258-4f'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model type='virtio'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <driver name='vhost' rx_queue_size='512'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <mtu size='1442'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='net0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </interface>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <serial type='pty'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <source path='/dev/pts/0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target type='isa-serial' port='0'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:         <model name='isa-serial'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       </target>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='serial0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </serial>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <console type='pty' tty='/dev/pts/0'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <source path='/dev/pts/0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <log file='/var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220/console.log' append='off'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <target type='serial' port='0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='serial0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </console>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <input type='tablet' bus='usb'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='input0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='usb' bus='0' port='1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <input type='mouse' bus='ps2'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='input1'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <input type='keyboard' bus='ps2'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='input2'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </input>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <listen type='address' address='::0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </graphics>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <audio id='1' type='none'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <video>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <model type='virtio' heads='1' primary='yes'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='video0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </video>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <watchdog model='itco' action='reset'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='watchdog0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </watchdog>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <memballoon model='virtio'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <stats period='10'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='balloon0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </memballoon>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <rng model='virtio'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <backend model='random'>/dev/urandom</backend>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <alias name='rng0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </rng>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </devices>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <label>system_u:system_r:svirt_t:s0:c659,c775</label>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c659,c775</imagelabel>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </seclabel>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <label>+107:+107</label>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <imagelabel>+107:+107</imagelabel>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </seclabel>
Feb 02 10:08:58 compute-1 nova_compute[226294]: </domain>
Feb 02 10:08:58 compute-1 nova_compute[226294]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.654 226298 WARNING nova.virt.libvirt.driver [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Detaching interface fa:16:3e:2f:49:24 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapc66e0be1-d1' not found.
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.655 226298 DEBUG nova.virt.libvirt.vif [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.656 226298 DEBUG nova.network.os_vif_util [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Converting VIF {"id": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "address": "fa:16:3e:2f:49:24", "network": {"id": "e125f54e-7556-49c5-8356-e7390df43c53", "bridge": "br-int", "label": "tempest-network-smoke--39971515", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc66e0be1-d1", "ovs_interfaceid": "c66e0be1-d166-4088-8ad8-baa84f3d032d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.657 226298 DEBUG nova.network.os_vif_util [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.658 226298 DEBUG os_vif [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.660 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.661 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc66e0be1-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.661 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.666 226298 INFO os_vif [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:49:24,bridge_name='br-int',has_traffic_filtering=True,id=c66e0be1-d166-4088-8ad8-baa84f3d032d,network=Network(e125f54e-7556-49c5-8356-e7390df43c53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc66e0be1-d1')
Feb 02 10:08:58 compute-1 nova_compute[226294]: 2026-02-02 10:08:58.667 226298 DEBUG nova.virt.libvirt.guest [req-f4a17fa9-f1c8-41aa-afac-d240f4b81791 req-65c090dd-7afc-41c2-a947-97c0907b12ca b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:name>tempest-TestNetworkBasicOps-server-1612354759</nova:name>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:creationTime>2026-02-02 10:08:58</nova:creationTime>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:flavor name="m1.nano">
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:memory>128</nova:memory>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:disk>1</nova:disk>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:swap>0</nova:swap>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:vcpus>1</nova:vcpus>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </nova:flavor>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:owner>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </nova:owner>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   <nova:ports>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     <nova:port uuid="09a00258-4f60-42dd-a769-b2ea3b870187">
Feb 02 10:08:58 compute-1 nova_compute[226294]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 02 10:08:58 compute-1 nova_compute[226294]:     </nova:port>
Feb 02 10:08:58 compute-1 nova_compute[226294]:   </nova:ports>
Feb 02 10:08:58 compute-1 nova_compute[226294]: </nova:instance>
Feb 02 10:08:58 compute-1 nova_compute[226294]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 02 10:08:58 compute-1 ceph-mon[80115]: pgmap v887: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 8.2 KiB/s wr, 29 op/s
Feb 02 10:08:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:08:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:08:59.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:08:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:08:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:08:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:08:59.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:08:59 compute-1 sudo[233071]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:08:59 compute-1 sudo[233071]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:08:59 compute-1 sudo[233071]: pam_unix(sudo:session): session closed for user root
Feb 02 10:08:59 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:08:59 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b1d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.307 226298 DEBUG nova.compute.manager [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.307 226298 DEBUG oslo_concurrency.lockutils [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.307 226298 DEBUG oslo_concurrency.lockutils [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.308 226298 DEBUG oslo_concurrency.lockutils [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.308 226298 DEBUG nova.compute.manager [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.308 226298 WARNING nova.compute.manager [req-2e78c648-e135-4024-b5f2-adcdf6740e14 req-b92ef01d-05f2-4fb7-8fcf-651c5e208451 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-c66e0be1-d166-4088-8ad8-baa84f3d032d for instance with vm_state active and task_state None.
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.342 226298 INFO nova.network.neutron [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Port c66e0be1-d166-4088-8ad8-baa84f3d032d from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.342 226298 DEBUG nova.network.neutron [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.358 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.391 226298 DEBUG oslo_concurrency.lockutils [None req-a9d9f71d-a6d4-4c48-8fe9-dc7b77e7a7c4 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "interface-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-c66e0be1-d166-4088-8ad8-baa84f3d032d" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.902s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:00 compute-1 ovn_controller[133666]: 2026-02-02T10:09:00Z|00053|binding|INFO|Releasing lport f5df8d3e-4c61-4492-9e28-98679c02afcc from this chassis (sb_readonly=0)
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.502 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:00 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:00 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:00 compute-1 nova_compute[226294]: 2026-02-02 10:09:00.713 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:00 compute-1 ceph-mon[80115]: pgmap v888: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 5.8 KiB/s wr, 28 op/s
Feb 02 10:09:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:01.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:01.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.699 226298 DEBUG nova.compute.manager [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.699 226298 DEBUG nova.compute.manager [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing instance network info cache due to event network-changed-09a00258-4f60-42dd-a769-b2ea3b870187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.700 226298 DEBUG oslo_concurrency.lockutils [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.700 226298 DEBUG oslo_concurrency.lockutils [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.700 226298 DEBUG nova.network.neutron [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Refreshing network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.785 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.786 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.786 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.786 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.787 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.789 226298 INFO nova.compute.manager [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Terminating instance
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.790 226298 DEBUG nova.compute.manager [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 02 10:09:01 compute-1 kernel: tap09a00258-4f (unregistering): left promiscuous mode
Feb 02 10:09:01 compute-1 NetworkManager[49055]: <info>  [1770026941.8556] device (tap09a00258-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.863 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:01 compute-1 ovn_controller[133666]: 2026-02-02T10:09:01Z|00054|binding|INFO|Releasing lport 09a00258-4f60-42dd-a769-b2ea3b870187 from this chassis (sb_readonly=0)
Feb 02 10:09:01 compute-1 ovn_controller[133666]: 2026-02-02T10:09:01Z|00055|binding|INFO|Setting lport 09a00258-4f60-42dd-a769-b2ea3b870187 down in Southbound
Feb 02 10:09:01 compute-1 ovn_controller[133666]: 2026-02-02T10:09:01Z|00056|binding|INFO|Removing iface tap09a00258-4f ovn-installed in OVS
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.865 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:01 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.873 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:9a:96 10.100.0.10'], port_security=['fa:16:3e:85:9a:96 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '15b2e821-5e8b-4d8a-9a48-7c6a30bd3220', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09104532-215f-4de3-9920-7fd818e6c676', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=755f8a60-018a-461f-bb4b-b9017895ccf7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=09a00258-4f60-42dd-a769-b2ea3b870187) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:09:01 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.875 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 09a00258-4f60-42dd-a769-b2ea3b870187 in datapath ba6c4c87-77a9-4fcc-aa14-a4637c78f692 unbound from our chassis
Feb 02 10:09:01 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.877 143542 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ba6c4c87-77a9-4fcc-aa14-a4637c78f692, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 02 10:09:01 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.878 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b0ac4a-c5c1-449e-88c7-81aa4dda2f80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:09:01 compute-1 nova_compute[226294]: 2026-02-02 10:09:01.879 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:01 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:01.881 143542 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 namespace which is not needed anymore
Feb 02 10:09:01 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Deactivated successfully.
Feb 02 10:09:01 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Consumed 17.991s CPU time.
Feb 02 10:09:01 compute-1 systemd-machined[195072]: Machine qemu-2-instance-00000006 terminated.
Feb 02 10:09:01 compute-1 podman[233097]: 2026-02-02 10:09:01.967594008 +0000 UTC m=+0.084438955 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 02 10:09:01 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:01 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:01 compute-1 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [NOTICE]   (232207) : haproxy version is 2.8.14-c23fe91
Feb 02 10:09:01 compute-1 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [NOTICE]   (232207) : path to executable is /usr/sbin/haproxy
Feb 02 10:09:01 compute-1 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [WARNING]  (232207) : Exiting Master process...
Feb 02 10:09:01 compute-1 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [WARNING]  (232207) : Exiting Master process...
Feb 02 10:09:01 compute-1 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [ALERT]    (232207) : Current worker (232217) exited with code 143 (Terminated)
Feb 02 10:09:01 compute-1 neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692[232203]: [WARNING]  (232207) : All workers exited. Exiting... (0)
Feb 02 10:09:01 compute-1 systemd[1]: libpod-2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81.scope: Deactivated successfully.
Feb 02 10:09:02 compute-1 podman[233145]: 2026-02-02 10:09:02.00454058 +0000 UTC m=+0.045054829 container died 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 02 10:09:02 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81-userdata-shm.mount: Deactivated successfully.
Feb 02 10:09:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-b7e71b523a6cf72f6079510db5422c0e2666a6b8442a4c07506d8ee1c5789881-merged.mount: Deactivated successfully.
Feb 02 10:09:02 compute-1 podman[233145]: 2026-02-02 10:09:02.048198729 +0000 UTC m=+0.088712968 container cleanup 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.046 226298 INFO nova.virt.libvirt.driver [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Instance destroyed successfully.
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.048 226298 DEBUG nova.objects.instance [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'resources' on Instance uuid 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:09:02 compute-1 systemd[1]: libpod-conmon-2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81.scope: Deactivated successfully.
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.063 226298 DEBUG nova.virt.libvirt.vif [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1612354759',display_name='tempest-TestNetworkBasicOps-server-1612354759',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1612354759',id=6,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMJFk+K1ECuOT65iOILNHAamJ3gGwluebBSGyHnh3tJwd+ehxpJY8ebkg+pBKZw/EcMgvzSZ3FmRm5/iJU1QzjfTd7kqiXGqABRWM3LPcjo/Kmnp/RcvKBxgSpfhTx8kiA==',key_name='tempest-TestNetworkBasicOps-1228211887',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:07:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-1fnfubus',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:07:09Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=15b2e821-5e8b-4d8a-9a48-7c6a30bd3220,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.063 226298 DEBUG nova.network.os_vif_util [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.065 226298 DEBUG nova.network.os_vif_util [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.065 226298 DEBUG os_vif [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.070 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09a00258-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:09:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.073 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.076 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.081 226298 INFO os_vif [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:9a:96,bridge_name='br-int',has_traffic_filtering=True,id=09a00258-4f60-42dd-a769-b2ea3b870187,network=Network(ba6c4c87-77a9-4fcc-aa14-a4637c78f692),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09a00258-4f')
Feb 02 10:09:02 compute-1 podman[233190]: 2026-02-02 10:09:02.112002725 +0000 UTC m=+0.045071609 container remove 2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 02 10:09:02 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.117 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5e989ec3-a3db-4586-8b88-cab6e49c8513]: (4, ('Mon Feb  2 10:09:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 (2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81)\n2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81\nMon Feb  2 10:09:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 (2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81)\n2561056d020f3623f48283542021bd9e76578181273ce1e80f0a4f2d36959d81\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:09:02 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.118 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[e49fcec6-e611-43c7-8528-604f3ac06b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:09:02 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.119 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba6c4c87-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:02 compute-1 kernel: tapba6c4c87-70: left promiscuous mode
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.127 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:02 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.129 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[04cba2ad-3bf2-4d91-a918-6d43504b7089]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:09:02 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.146 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[cecf340c-9100-411b-9ef5-e67eda71bc47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:09:02 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.147 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d3a369-b465-4324-ba52-3ee8c9f9c962]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:09:02 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.161 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[fd92aaab-3cba-40b1-96d3-be2ba50cd190]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399078, 'reachable_time': 35155, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233221, 'error': None, 'target': 'ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:09:02 compute-1 systemd[1]: run-netns-ovnmeta\x2dba6c4c87\x2d77a9\x2d4fcc\x2daa14\x2da4637c78f692.mount: Deactivated successfully.
Feb 02 10:09:02 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.163 143813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ba6c4c87-77a9-4fcc-aa14-a4637c78f692 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 02 10:09:02 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:02.164 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc6b143-94d2-484e-b114-1b843a192a83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:09:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.377 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-unplugged-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.378 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.379 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.379 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.379 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-unplugged-09a00258-4f60-42dd-a769-b2ea3b870187 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.380 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-unplugged-09a00258-4f60-42dd-a769-b2ea3b870187 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.380 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.380 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.380 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.381 226298 DEBUG oslo_concurrency.lockutils [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.381 226298 DEBUG nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] No waiting events found dispatching network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.381 226298 WARNING nova.compute.manager [req-70fe2c32-d8ca-49d4-b148-d0d1edd20c63 req-22c911f7-d15d-486c-b45f-fb4621dee9d9 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received unexpected event network-vif-plugged-09a00258-4f60-42dd-a769-b2ea3b870187 for instance with vm_state active and task_state deleting.
Feb 02 10:09:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:02 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:02 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b1f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.513 226298 INFO nova.virt.libvirt.driver [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Deleting instance files /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_del
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.514 226298 INFO nova.virt.libvirt.driver [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Deletion of /var/lib/nova/instances/15b2e821-5e8b-4d8a-9a48-7c6a30bd3220_del complete
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.559 226298 INFO nova.compute.manager [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Took 0.77 seconds to destroy the instance on the hypervisor.
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.561 226298 DEBUG oslo.service.loopingcall [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.561 226298 DEBUG nova.compute.manager [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.561 226298 DEBUG nova.network.neutron [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.712 226298 DEBUG nova.network.neutron [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updated VIF entry in instance network info cache for port 09a00258-4f60-42dd-a769-b2ea3b870187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.712 226298 DEBUG nova.network.neutron [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [{"id": "09a00258-4f60-42dd-a769-b2ea3b870187", "address": "fa:16:3e:85:9a:96", "network": {"id": "ba6c4c87-77a9-4fcc-aa14-a4637c78f692", "bridge": "br-int", "label": "tempest-network-smoke--761234279", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09a00258-4f", "ovs_interfaceid": "09a00258-4f60-42dd-a769-b2ea3b870187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:09:02 compute-1 nova_compute[226294]: 2026-02-02 10:09:02.732 226298 DEBUG oslo_concurrency.lockutils [req-3e4cbe8c-e119-4596-a0b1-b54750a128d4 req-34f68f31-414b-4358-9512-d6c344c9d463 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:09:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:03.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:03 compute-1 ceph-mon[80115]: pgmap v889: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 5.8 KiB/s wr, 28 op/s
Feb 02 10:09:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:03.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.635 226298 DEBUG nova.network.neutron [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.650 226298 INFO nova.compute.manager [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Took 1.09 seconds to deallocate network for instance.
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.709 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.710 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.744 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.770 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.771 226298 DEBUG nova.compute.provider_tree [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.789 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.809 226298 DEBUG nova.compute.manager [req-152ddc3c-7b1d-4940-af10-4df5772e9263 req-ba355b06-d9da-45f7-a2b5-69a117efce9d b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Received event network-vif-deleted-09a00258-4f60-42dd-a769-b2ea3b870187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.814 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 02 10:09:03 compute-1 nova_compute[226294]: 2026-02-02 10:09:03.861 226298 DEBUG oslo_concurrency.processutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:09:03 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:03 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:04 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:04.233 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:09:04 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:09:04 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2825372233' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:04 compute-1 nova_compute[226294]: 2026-02-02 10:09:04.333 226298 DEBUG oslo_concurrency.processutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:09:04 compute-1 nova_compute[226294]: 2026-02-02 10:09:04.340 226298 DEBUG nova.compute.provider_tree [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:09:04 compute-1 nova_compute[226294]: 2026-02-02 10:09:04.356 226298 DEBUG nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:09:04 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2825372233' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:04 compute-1 nova_compute[226294]: 2026-02-02 10:09:04.379 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:04 compute-1 nova_compute[226294]: 2026-02-02 10:09:04.432 226298 INFO nova.scheduler.client.report [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Deleted allocations for instance 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220
Feb 02 10:09:04 compute-1 nova_compute[226294]: 2026-02-02 10:09:04.509 226298 DEBUG oslo_concurrency.lockutils [None req-97a70864-13d1-4f21-ae03-5e69b2b80b33 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "15b2e821-5e8b-4d8a-9a48-7c6a30bd3220" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:04 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:04 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:05.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:05.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:05 compute-1 ceph-mon[80115]: pgmap v890: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 7.0 KiB/s wr, 56 op/s
Feb 02 10:09:05 compute-1 nova_compute[226294]: 2026-02-02 10:09:05.761 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:05 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:05 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b210 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:06 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:06 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:06 compute-1 ceph-mon[80115]: pgmap v891: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 3.2 KiB/s wr, 28 op/s
Feb 02 10:09:07 compute-1 nova_compute[226294]: 2026-02-02 10:09:07.072 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:07.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:07.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:07 compute-1 nova_compute[226294]: 2026-02-02 10:09:07.773 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:07 compute-1 nova_compute[226294]: 2026-02-02 10:09:07.798 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:07 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:07 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:08 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 02 10:09:08 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:08 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:08 compute-1 ceph-mon[80115]: pgmap v892: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 3.2 KiB/s wr, 28 op/s
Feb 02 10:09:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:09:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 11K writes, 41K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 11K writes, 2812 syncs, 3.92 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2196 writes, 6935 keys, 2196 commit groups, 1.0 writes per commit group, ingest: 6.82 MB, 0.01 MB/s
                                           Interval WAL: 2196 writes, 936 syncs, 2.35 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 02 10:09:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:09.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:09.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:09 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:09 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:10 compute-1 podman[233251]: 2026-02-02 10:09:10.39647417 +0000 UTC m=+0.068073840 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 02 10:09:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2488635550' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2731748324' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:10 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:10 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:10 compute-1 nova_compute[226294]: 2026-02-02 10:09:10.680 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:09:10 compute-1 nova_compute[226294]: 2026-02-02 10:09:10.804 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:11.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:11.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:11 compute-1 ceph-mon[80115]: pgmap v893: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Feb 02 10:09:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2017275887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:11 compute-1 nova_compute[226294]: 2026-02-02 10:09:11.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:09:11 compute-1 nova_compute[226294]: 2026-02-02 10:09:11.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:09:11 compute-1 nova_compute[226294]: 2026-02-02 10:09:11.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:09:11 compute-1 nova_compute[226294]: 2026-02-02 10:09:11.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:09:11 compute-1 nova_compute[226294]: 2026-02-02 10:09:11.676 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:09:11 compute-1 nova_compute[226294]: 2026-02-02 10:09:11.676 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:09:11 compute-1 nova_compute[226294]: 2026-02-02 10:09:11.677 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:09:11 compute-1 nova_compute[226294]: 2026-02-02 10:09:11.677 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:09:11 compute-1 nova_compute[226294]: 2026-02-02 10:09:11.678 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:09:11 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:11 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:12 compute-1 nova_compute[226294]: 2026-02-02 10:09:12.109 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:12 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:12 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:12 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1206153838' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:12 compute-1 nova_compute[226294]: 2026-02-02 10:09:12.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:09:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:13.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:13.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:13 compute-1 ceph-mon[80115]: pgmap v894: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Feb 02 10:09:13 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:13 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:14 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:14 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:14 compute-1 nova_compute[226294]: 2026-02-02 10:09:14.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:09:14 compute-1 nova_compute[226294]: 2026-02-02 10:09:14.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:09:14 compute-1 nova_compute[226294]: 2026-02-02 10:09:14.673 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:09:14 compute-1 nova_compute[226294]: 2026-02-02 10:09:14.673 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:09:14 compute-1 nova_compute[226294]: 2026-02-02 10:09:14.674 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:14 compute-1 nova_compute[226294]: 2026-02-02 10:09:14.674 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:09:14 compute-1 nova_compute[226294]: 2026-02-02 10:09:14.674 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:09:14 compute-1 ceph-mon[80115]: pgmap v895: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Feb 02 10:09:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:09:15 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3180929845' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.149 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:09:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:15.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:15.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.346 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.348 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4891MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.349 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.349 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.425 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.426 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.447 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:09:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3180929845' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:15 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:09:15 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/422795760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.941 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.948 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.967 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.994 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:09:15 compute-1 nova_compute[226294]: 2026-02-02 10:09:15.995 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:15 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:15 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:16 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:16 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/422795760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:16 compute-1 ceph-mon[80115]: pgmap v896: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:09:17 compute-1 nova_compute[226294]: 2026-02-02 10:09:17.032 226298 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1770026942.0311568, 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:09:17 compute-1 nova_compute[226294]: 2026-02-02 10:09:17.033 226298 INFO nova.compute.manager [-] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] VM Stopped (Lifecycle Event)
Feb 02 10:09:17 compute-1 nova_compute[226294]: 2026-02-02 10:09:17.051 226298 DEBUG nova.compute.manager [None req-63b1e011-ab2b-41ab-922b-4a317db92e93 - - - - - -] [instance: 15b2e821-5e8b-4d8a-9a48-7c6a30bd3220] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:09:17 compute-1 nova_compute[226294]: 2026-02-02 10:09:17.112 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:17.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:17.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:09:17 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:17 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:18 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:18 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:18 compute-1 ceph-mon[80115]: pgmap v897: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Feb 02 10:09:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:19.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:19.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:19 compute-1 sudo[233321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:09:19 compute-1 sudo[233321]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:09:19 compute-1 sudo[233321]: pam_unix(sudo:session): session closed for user root
Feb 02 10:09:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:19 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f84b800b2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c003b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4255523962' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:20 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:20 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:20 compute-1 nova_compute[226294]: 2026-02-02 10:09:20.812 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:21.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:21.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:21 compute-1 ceph-mon[80115]: pgmap v898: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.448649) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961448736, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2337, "num_deletes": 251, "total_data_size": 5842028, "memory_usage": 5928096, "flush_reason": "Manual Compaction"}
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961487343, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3781117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26072, "largest_seqno": 28404, "table_properties": {"data_size": 3772144, "index_size": 5467, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19281, "raw_average_key_size": 20, "raw_value_size": 3753887, "raw_average_value_size": 3918, "num_data_blocks": 242, "num_entries": 958, "num_filter_entries": 958, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026763, "oldest_key_time": 1770026763, "file_creation_time": 1770026961, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 38746 microseconds, and 8936 cpu microseconds.
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.487407) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3781117 bytes OK
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.487434) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.489951) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.489980) EVENT_LOG_v1 {"time_micros": 1770026961489972, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.490005) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 5831682, prev total WAL file size 5831682, number of live WAL files 2.
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.491349) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3692KB)], [51(11MB)]
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961491413, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 16145690, "oldest_snapshot_seqno": -1}
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5849 keys, 13998793 bytes, temperature: kUnknown
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961637830, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 13998793, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13959385, "index_size": 23682, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14661, "raw_key_size": 148732, "raw_average_key_size": 25, "raw_value_size": 13853557, "raw_average_value_size": 2368, "num_data_blocks": 966, "num_entries": 5849, "num_filter_entries": 5849, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770026961, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.638190) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 13998793 bytes
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.639560) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 110.2 rd, 95.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 11.8 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 6365, records dropped: 516 output_compression: NoCompression
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.639589) EVENT_LOG_v1 {"time_micros": 1770026961639575, "job": 30, "event": "compaction_finished", "compaction_time_micros": 146512, "compaction_time_cpu_micros": 34138, "output_level": 6, "num_output_files": 1, "total_output_size": 13998793, "num_input_records": 6365, "num_output_records": 5849, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961640389, "job": 30, "event": "table_file_deletion", "file_number": 53}
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770026961642408, "job": 30, "event": "table_file_deletion", "file_number": 51}
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.491219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642714) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:09:21 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:09:21.642719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:09:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001ba0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:22 compute-1 nova_compute[226294]: 2026-02-02 10:09:22.154 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:22 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:22 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:22 compute-1 ceph-mon[80115]: pgmap v899: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:09:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:23.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:23.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001ba0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:24 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:24 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:24 compute-1 ceph-mon[80115]: pgmap v900: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:09:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb 02 10:09:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:25.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb 02 10:09:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:25.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:25 compute-1 nova_compute[226294]: 2026-02-02 10:09:25.815 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1278242065' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:09:26 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:26 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001d40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:27 compute-1 nova_compute[226294]: 2026-02-02 10:09:27.182 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:27.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:27.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:27 compute-1 ceph-mon[80115]: pgmap v901: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:09:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1301271496' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:09:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:28 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:28 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:28 compute-1 ceph-mon[80115]: pgmap v902: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Feb 02 10:09:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:29.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:29.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c001d40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:30 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:30 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:30 compute-1 ceph-mon[80115]: pgmap v903: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Feb 02 10:09:30 compute-1 nova_compute[226294]: 2026-02-02 10:09:30.816 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:31.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:31.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:32 compute-1 nova_compute[226294]: 2026-02-02 10:09:32.184 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:09:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:32 compute-1 podman[233354]: 2026-02-02 10:09:32.468188837 +0000 UTC m=+0.130979821 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 02 10:09:32 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:32 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:33.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:33 compute-1 ceph-mon[80115]: pgmap v904: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Feb 02 10:09:33 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3844840386' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:33.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:34 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:34 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:34 compute-1 ceph-mon[80115]: pgmap v905: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 127 op/s
Feb 02 10:09:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:09:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 5356 writes, 28K keys, 5356 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s
                                           Cumulative WAL: 5356 writes, 5356 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1530 writes, 7398 keys, 1530 commit groups, 1.0 writes per commit group, ingest: 16.91 MB, 0.03 MB/s
                                           Interval WAL: 1530 writes, 1530 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    109.1      0.40              0.11        15    0.027       0      0       0.0       0.0
                                             L6      1/0   13.35 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.0    119.4    101.8      1.73              0.40        14    0.123     74K   7395       0.0       0.0
                                            Sum      1/0   13.35 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.0     96.9    103.2      2.13              0.51        29    0.073     74K   7395       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.8     99.4    101.1      0.73              0.18        10    0.073     30K   2565       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    119.4    101.8      1.73              0.40        14    0.123     74K   7395       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    109.6      0.40              0.11        14    0.028       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.043, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.21 GB write, 0.12 MB/s write, 0.20 GB read, 0.11 MB/s read, 2.1 seconds
                                           Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a64debd350#2 capacity: 304.00 MB usage: 17.54 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.00019 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(926,16.95 MB,5.57595%) FilterBlock(29,220.61 KB,0.070868%) IndexBlock(29,383.73 KB,0.12327%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Feb 02 10:09:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:35.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:35.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:35 compute-1 nova_compute[226294]: 2026-02-02 10:09:35.841 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f848c004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:36 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:36 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8490003040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:36 compute-1 ceph-mon[80115]: pgmap v906: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 100 op/s
Feb 02 10:09:37 compute-1 nova_compute[226294]: 2026-02-02 10:09:37.213 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:37.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:37.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:37 compute-1 sudo[233384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:09:37 compute-1 sudo[233384]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:09:37 compute-1 sudo[233384]: pam_unix(sudo:session): session closed for user root
Feb 02 10:09:37 compute-1 sudo[233409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Feb 02 10:09:37 compute-1 sudo[233409]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:09:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f849c003f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Feb 02 10:09:38 compute-1 kernel: ganesha.nfsd[232989]: segfault at 50 ip 00007f8543c9b32e sp 00007f84a77fd210 error 4 in libntirpc.so.5.8[7f8543c80000+2c000] likely on CPU 1 (core 0, socket 1)
Feb 02 10:09:38 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Feb 02 10:09:38 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx[229168]: 02/02/2026 10:09:38 : epoch 6980764f : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8498003040 fd 48 proxy ignored for local
Feb 02 10:09:38 compute-1 systemd[1]: Started Process Core Dump (PID 233434/UID 0).
Feb 02 10:09:38 compute-1 sudo[233409]: pam_unix(sudo:session): session closed for user root
Feb 02 10:09:38 compute-1 sudo[233457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:09:38 compute-1 sudo[233457]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:09:38 compute-1 sudo[233457]: pam_unix(sudo:session): session closed for user root
Feb 02 10:09:38 compute-1 sudo[233482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:09:38 compute-1 sudo[233482]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:09:38 compute-1 sudo[233482]: pam_unix(sudo:session): session closed for user root
Feb 02 10:09:39 compute-1 sudo[233541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:09:39 compute-1 sudo[233541]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:09:39 compute-1 sudo[233541]: pam_unix(sudo:session): session closed for user root
Feb 02 10:09:39 compute-1 systemd-coredump[233437]: Process 229173 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 81:
                                                    #0  0x00007f8543c9b32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Feb 02 10:09:39 compute-1 sudo[233566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid d241d473-9fcb-5f74-b163-f1ca4454e7f1 -- inventory --format=json-pretty --filter-for-batch
Feb 02 10:09:39 compute-1 sudo[233566]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:09:39 compute-1 systemd[1]: systemd-coredump@13-233434-0.service: Deactivated successfully.
Feb 02 10:09:39 compute-1 podman[233595]: 2026-02-02 10:09:39.141565911 +0000 UTC m=+0.039461879 container died 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Feb 02 10:09:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-aab1a8eaea1398d15f9f0e4bf76ebc7ab73640ad6ddb093b4227eedbb09799dc-merged.mount: Deactivated successfully.
Feb 02 10:09:39 compute-1 podman[233595]: 2026-02-02 10:09:39.180658599 +0000 UTC m=+0.078554497 container remove 4a082278f968d851715d3ff83b10198d099fc94dfdc956a8f353cfb211d0aa31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-nfs-cephfs-0-0-compute-1-mhzhsx, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:09:39 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Main process exited, code=exited, status=139/n/a
Feb 02 10:09:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:39.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:39 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:39 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:39 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:39 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:39 compute-1 ceph-mon[80115]: pgmap v907: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 100 op/s
Feb 02 10:09:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:39.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:39 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 10:09:39 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.782s CPU time.
Feb 02 10:09:39 compute-1 podman[233678]: 2026-02-02 10:09:39.481740535 +0000 UTC m=+0.064145684 container create 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:09:39 compute-1 systemd[1]: Started libpod-conmon-2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d.scope.
Feb 02 10:09:39 compute-1 systemd[1]: Started libcrun container.
Feb 02 10:09:39 compute-1 podman[233678]: 2026-02-02 10:09:39.458756345 +0000 UTC m=+0.041161504 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 10:09:39 compute-1 podman[233678]: 2026-02-02 10:09:39.561228826 +0000 UTC m=+0.143634025 container init 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Feb 02 10:09:39 compute-1 podman[233678]: 2026-02-02 10:09:39.571509679 +0000 UTC m=+0.153914828 container start 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Feb 02 10:09:39 compute-1 podman[233678]: 2026-02-02 10:09:39.57642783 +0000 UTC m=+0.158833039 container attach 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:09:39 compute-1 beautiful_chatterjee[233694]: 167 167
Feb 02 10:09:39 compute-1 systemd[1]: libpod-2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d.scope: Deactivated successfully.
Feb 02 10:09:39 compute-1 podman[233678]: 2026-02-02 10:09:39.580340854 +0000 UTC m=+0.162745973 container died 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:09:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-57ffeb4ee93a5dee9150cae19b2072c7324f0e57bb2e2112cb59a9de266a868e-merged.mount: Deactivated successfully.
Feb 02 10:09:39 compute-1 podman[233678]: 2026-02-02 10:09:39.622508144 +0000 UTC m=+0.204913293 container remove 2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Feb 02 10:09:39 compute-1 systemd[1]: libpod-conmon-2a3f7ff2cdd90b1d31fc3245b8d99939323ae7d01e5331cebb513aef2978a22d.scope: Deactivated successfully.
Feb 02 10:09:39 compute-1 podman[233719]: 2026-02-02 10:09:39.785529944 +0000 UTC m=+0.060399716 container create 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True)
Feb 02 10:09:39 compute-1 systemd[1]: Started libpod-conmon-0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca.scope.
Feb 02 10:09:39 compute-1 podman[233719]: 2026-02-02 10:09:39.75827999 +0000 UTC m=+0.033149792 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Feb 02 10:09:39 compute-1 systemd[1]: Started libcrun container.
Feb 02 10:09:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 02 10:09:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 02 10:09:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 02 10:09:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 02 10:09:39 compute-1 podman[233719]: 2026-02-02 10:09:39.87727417 +0000 UTC m=+0.152143902 container init 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Feb 02 10:09:39 compute-1 podman[233719]: 2026-02-02 10:09:39.886535256 +0000 UTC m=+0.161405008 container start 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Feb 02 10:09:39 compute-1 podman[233719]: 2026-02-02 10:09:39.893927452 +0000 UTC m=+0.168797264 container attach 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Feb 02 10:09:40 compute-1 sudo[233740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:09:40 compute-1 sudo[233740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:09:40 compute-1 sudo[233740]: pam_unix(sudo:session): session closed for user root
Feb 02 10:09:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3490136624' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:40 compute-1 priceless_cannon[233735]: [
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:     {
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         "available": false,
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         "being_replaced": false,
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         "ceph_device_lvm": false,
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         "lsm_data": {},
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         "lvs": [],
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         "path": "/dev/sr0",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         "rejected_reasons": [
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "Has a FileSystem",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "Insufficient space (<5GB)"
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         ],
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         "sys_api": {
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "actuators": null,
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "device_nodes": [
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:                 "sr0"
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             ],
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "devname": "sr0",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "human_readable_size": "482.00 KB",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "id_bus": "ata",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "model": "QEMU DVD-ROM",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "nr_requests": "2",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "parent": "/dev/sr0",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "partitions": {},
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "path": "/dev/sr0",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "removable": "1",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "rev": "2.5+",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "ro": "0",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "rotational": "1",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "sas_address": "",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "sas_device_handle": "",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "scheduler_mode": "mq-deadline",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "sectors": 0,
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "sectorsize": "2048",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "size": 493568.0,
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "support_discard": "2048",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "type": "disk",
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:             "vendor": "QEMU"
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:         }
Feb 02 10:09:40 compute-1 priceless_cannon[233735]:     }
Feb 02 10:09:40 compute-1 priceless_cannon[233735]: ]
Feb 02 10:09:40 compute-1 systemd[1]: libpod-0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca.scope: Deactivated successfully.
Feb 02 10:09:40 compute-1 podman[233719]: 2026-02-02 10:09:40.594767226 +0000 UTC m=+0.869636948 container died 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Feb 02 10:09:40 compute-1 systemd[1]: var-lib-containers-storage-overlay-62a567448a77d4ae7e7cf080a0ff49d6e20e8117f8d43b2f65ef01d5deb92542-merged.mount: Deactivated successfully.
Feb 02 10:09:40 compute-1 podman[233719]: 2026-02-02 10:09:40.63972578 +0000 UTC m=+0.914595512 container remove 0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_cannon, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:09:40 compute-1 systemd[1]: libpod-conmon-0ac330c2797daefe2a280259c223b57e79f9ee9ca604f3cc7b13d500250714ca.scope: Deactivated successfully.
Feb 02 10:09:40 compute-1 podman[234850]: 2026-02-02 10:09:40.66683068 +0000 UTC m=+0.051731835 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 02 10:09:40 compute-1 sudo[233566]: pam_unix(sudo:session): session closed for user root
Feb 02 10:09:40 compute-1 nova_compute[226294]: 2026-02-02 10:09:40.877 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:41.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:41.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:41 compute-1 ceph-mon[80115]: pgmap v908: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.2 KiB/s wr, 90 op/s
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:09:41 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:09:42 compute-1 nova_compute[226294]: 2026-02-02 10:09:42.248 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:42 compute-1 ovn_controller[133666]: 2026-02-02T10:09:42Z|00057|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 02 10:09:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:43.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:43.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:43 compute-1 ceph-mon[80115]: pgmap v909: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.2 KiB/s wr, 90 op/s
Feb 02 10:09:43 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1448999903' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:09:43 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1044743080' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:09:44 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/100944 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:09:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:09:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:09:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:09:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:45.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:45.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:45 compute-1 sudo[234882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:09:45 compute-1 sudo[234882]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:09:45 compute-1 sudo[234882]: pam_unix(sudo:session): session closed for user root
Feb 02 10:09:45 compute-1 ceph-mon[80115]: pgmap v910: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 117 op/s
Feb 02 10:09:45 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:45 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:09:45 compute-1 nova_compute[226294]: 2026-02-02 10:09:45.878 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:46 compute-1 ceph-mon[80115]: pgmap v911: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:09:47 compute-1 nova_compute[226294]: 2026-02-02 10:09:47.251 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:47.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:47.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:09:48 compute-1 ceph-mon[80115]: pgmap v912: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Feb 02 10:09:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:49.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:49.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:49 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Scheduled restart job, restart counter is at 14.
Feb 02 10:09:49 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 10:09:49 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Consumed 1.782s CPU time.
Feb 02 10:09:49 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Start request repeated too quickly.
Feb 02 10:09:49 compute-1 systemd[1]: ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1@nfs.cephfs.0.0.compute-1.mhzhsx.service: Failed with result 'exit-code'.
Feb 02 10:09:49 compute-1 systemd[1]: Failed to start Ceph nfs.cephfs.0.0.compute-1.mhzhsx for d241d473-9fcb-5f74-b163-f1ca4454e7f1.
Feb 02 10:09:50 compute-1 ceph-mon[80115]: pgmap v913: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Feb 02 10:09:50 compute-1 nova_compute[226294]: 2026-02-02 10:09:50.924 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:51.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:51.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2724965647' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:09:52 compute-1 nova_compute[226294]: 2026-02-02 10:09:52.283 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:52 compute-1 ceph-mon[80115]: pgmap v914: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Feb 02 10:09:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:53.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:53.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:54 compute-1 ceph-mon[80115]: pgmap v915: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 127 op/s
Feb 02 10:09:55 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:55.066 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:09:55 compute-1 nova_compute[226294]: 2026-02-02 10:09:55.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:55 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:09:55.068 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:09:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:55.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:09:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:55.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:09:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3722678984' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:09:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3722678984' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:09:55 compute-1 nova_compute[226294]: 2026-02-02 10:09:55.970 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:56 compute-1 ceph-mon[80115]: pgmap v916: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 100 op/s
Feb 02 10:09:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:57.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:09:57 compute-1 nova_compute[226294]: 2026-02-02 10:09:57.323 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:09:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:57.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:09:58 compute-1 ceph-mon[80115]: pgmap v917: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 100 op/s
Feb 02 10:09:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:09:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:09:59.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:09:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:09:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:09:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:09:59.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:00 compute-1 ceph-mon[80115]: overall HEALTH_OK
Feb 02 10:10:00 compute-1 sudo[234914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:10:00 compute-1 sudo[234914]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:10:00 compute-1 sudo[234914]: pam_unix(sudo:session): session closed for user root
Feb 02 10:10:01 compute-1 nova_compute[226294]: 2026-02-02 10:10:01.010 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:01 compute-1 ceph-mon[80115]: pgmap v918: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Feb 02 10:10:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:01.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:01.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:10:02 compute-1 nova_compute[226294]: 2026-02-02 10:10:02.371 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:03 compute-1 ceph-mon[80115]: pgmap v919: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Feb 02 10:10:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:03.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:03.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:03 compute-1 podman[234941]: 2026-02-02 10:10:03.481007935 +0000 UTC m=+0.151066183 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 02 10:10:04 compute-1 ceph-mon[80115]: pgmap v920: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Feb 02 10:10:05 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:10:05.071 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:10:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:05.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:05.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:06 compute-1 nova_compute[226294]: 2026-02-02 10:10:06.011 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:06 compute-1 ceph-mon[80115]: pgmap v921: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:10:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:07.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:07.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:07 compute-1 nova_compute[226294]: 2026-02-02 10:10:07.404 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:08 compute-1 ceph-mon[80115]: pgmap v922: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:10:08 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2215496927' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:09.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:09.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:09 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2018837737' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:11 compute-1 ceph-mon[80115]: pgmap v923: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:10:11 compute-1 nova_compute[226294]: 2026-02-02 10:10:11.060 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:11.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:11.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:11 compute-1 podman[234971]: 2026-02-02 10:10:11.416926501 +0000 UTC m=+0.088977564 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 02 10:10:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:12 compute-1 nova_compute[226294]: 2026-02-02 10:10:12.406 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:12 compute-1 nova_compute[226294]: 2026-02-02 10:10:12.995 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.014 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.015 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.015 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.028 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.029 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.029 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.030 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.030 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:13 compute-1 ceph-mon[80115]: pgmap v924: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:10:13 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1530118463' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:13.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:13.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:13 compute-1 nova_compute[226294]: 2026-02-02 10:10:13.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:10:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3241869427' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1103873038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:10:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:15.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:10:15 compute-1 ceph-mon[80115]: pgmap v925: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:10:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:15.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:15 compute-1 nova_compute[226294]: 2026-02-02 10:10:15.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:16 compute-1 nova_compute[226294]: 2026-02-02 10:10:16.061 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:16 compute-1 nova_compute[226294]: 2026-02-02 10:10:16.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:10:16 compute-1 nova_compute[226294]: 2026-02-02 10:10:16.676 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:10:16 compute-1 nova_compute[226294]: 2026-02-02 10:10:16.677 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:10:16 compute-1 nova_compute[226294]: 2026-02-02 10:10:16.677 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:10:16 compute-1 nova_compute[226294]: 2026-02-02 10:10:16.677 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:10:16 compute-1 nova_compute[226294]: 2026-02-02 10:10:16.678 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:10:16 compute-1 ceph-mon[80115]: pgmap v926: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:10:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:10:17 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/975664394' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.123 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.293 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.294 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4888MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.295 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.295 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:10:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:17.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.374 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.374 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:10:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:17.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.394 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:10:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.407 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:17 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/975664394' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:10:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:10:17 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/293474388' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.848 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.854 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.871 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.874 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:10:17 compute-1 nova_compute[226294]: 2026-02-02 10:10:17.874 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:10:18 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/293474388' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:18 compute-1 ceph-mon[80115]: pgmap v927: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:10:18 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1726624806' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:10:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:10:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:19.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:10:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:19.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:19 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2026861310' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:10:20 compute-1 sudo[235038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:10:20 compute-1 sudo[235038]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:10:20 compute-1 sudo[235038]: pam_unix(sudo:session): session closed for user root
Feb 02 10:10:20 compute-1 ceph-mon[80115]: pgmap v928: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:10:21 compute-1 nova_compute[226294]: 2026-02-02 10:10:21.064 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:21.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:21.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:22 compute-1 nova_compute[226294]: 2026-02-02 10:10:22.445 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:22 compute-1 ceph-mon[80115]: pgmap v929: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:10:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:10:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:23.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:10:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:23.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:24 compute-1 ceph-mon[80115]: pgmap v930: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Feb 02 10:10:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:10:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:25.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:10:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:25.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:26 compute-1 nova_compute[226294]: 2026-02-02 10:10:26.066 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:26 compute-1 ceph-mon[80115]: pgmap v931: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Feb 02 10:10:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb 02 10:10:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:27.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb 02 10:10:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:27.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:27 compute-1 nova_compute[226294]: 2026-02-02 10:10:27.495 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:28 compute-1 ceph-mon[80115]: pgmap v932: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 273 op/s
Feb 02 10:10:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:10:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:29.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:10:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:29.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:30 compute-1 ceph-mon[80115]: pgmap v933: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 12 KiB/s wr, 245 op/s
Feb 02 10:10:31 compute-1 nova_compute[226294]: 2026-02-02 10:10:31.068 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:31.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:10:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:31.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:10:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:10:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:32 compute-1 nova_compute[226294]: 2026-02-02 10:10:32.528 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:33 compute-1 ceph-mon[80115]: pgmap v934: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 12 KiB/s wr, 245 op/s
Feb 02 10:10:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:33.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:33.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:34 compute-1 podman[235070]: 2026-02-02 10:10:34.44619357 +0000 UTC m=+0.120142002 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 02 10:10:34 compute-1 ceph-mon[80115]: pgmap v935: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 308 op/s
Feb 02 10:10:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:35.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:35.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:36 compute-1 nova_compute[226294]: 2026-02-02 10:10:36.088 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:36 compute-1 ceph-mon[80115]: pgmap v936: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 428 KiB/s rd, 2.1 MiB/s wr, 235 op/s
Feb 02 10:10:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:37.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:37.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:37 compute-1 nova_compute[226294]: 2026-02-02 10:10:37.531 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:38 compute-1 ceph-mon[80115]: pgmap v937: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 428 KiB/s rd, 2.1 MiB/s wr, 235 op/s
Feb 02 10:10:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:39.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:39.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:40 compute-1 sudo[235099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:10:40 compute-1 sudo[235099]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:10:40 compute-1 sudo[235099]: pam_unix(sudo:session): session closed for user root
Feb 02 10:10:40 compute-1 ceph-mon[80115]: pgmap v938: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Feb 02 10:10:41 compute-1 nova_compute[226294]: 2026-02-02 10:10:41.113 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:41.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:41.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:42 compute-1 podman[235125]: 2026-02-02 10:10:42.409207376 +0000 UTC m=+0.086198870 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 02 10:10:42 compute-1 nova_compute[226294]: 2026-02-02 10:10:42.560 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:42 compute-1 ceph-mon[80115]: pgmap v939: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Feb 02 10:10:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:43.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:43.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:44 compute-1 ceph-mon[80115]: pgmap v940: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 328 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Feb 02 10:10:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:10:44.909 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:10:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:10:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:10:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:10:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:10:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:45.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:45.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:45 compute-1 sudo[235147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:10:45 compute-1 sudo[235147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:10:45 compute-1 sudo[235147]: pam_unix(sudo:session): session closed for user root
Feb 02 10:10:45 compute-1 sudo[235172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:10:45 compute-1 sudo[235172]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:10:46 compute-1 nova_compute[226294]: 2026-02-02 10:10:46.150 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:46 compute-1 sudo[235172]: pam_unix(sudo:session): session closed for user root
Feb 02 10:10:46 compute-1 ceph-mon[80115]: pgmap v941: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 12 KiB/s wr, 1 op/s
Feb 02 10:10:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:47.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:47.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:47 compute-1 nova_compute[226294]: 2026-02-02 10:10:47.562 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:10:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1165621846' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:10:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:10:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:10:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:10:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:10:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:10:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:10:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:10:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:10:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:10:49 compute-1 ceph-mon[80115]: pgmap v942: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 19 KiB/s wr, 29 op/s
Feb 02 10:10:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:49.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:49.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:50 compute-1 ceph-mon[80115]: pgmap v943: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 6.5 KiB/s wr, 28 op/s
Feb 02 10:10:51 compute-1 nova_compute[226294]: 2026-02-02 10:10:51.204 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:51.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:51.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:52 compute-1 nova_compute[226294]: 2026-02-02 10:10:52.591 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:52 compute-1 ceph-mon[80115]: pgmap v944: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 6.5 KiB/s wr, 28 op/s
Feb 02 10:10:53 compute-1 sudo[235230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:10:53 compute-1 sudo[235230]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:10:53 compute-1 sudo[235230]: pam_unix(sudo:session): session closed for user root
Feb 02 10:10:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:53.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:10:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:53.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:10:54 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:10:54 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:10:55 compute-1 ceph-mon[80115]: pgmap v945: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 6.5 KiB/s wr, 28 op/s
Feb 02 10:10:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/1470274605' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:10:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/1470274605' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:10:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:55.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:10:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:55.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:10:55 compute-1 nova_compute[226294]: 2026-02-02 10:10:55.520 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:55 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:10:55.521 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:10:55 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:10:55.523 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:10:56 compute-1 nova_compute[226294]: 2026-02-02 10:10:56.229 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:56 compute-1 ceph-mon[80115]: pgmap v946: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 6.5 KiB/s wr, 28 op/s
Feb 02 10:10:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:57.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:10:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:57.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:57 compute-1 nova_compute[226294]: 2026-02-02 10:10:57.625 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:10:58 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:10:58.525 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:10:58 compute-1 ceph-mon[80115]: pgmap v947: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 6.5 KiB/s wr, 28 op/s
Feb 02 10:10:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:10:59.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:10:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:10:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:10:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:10:59.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:00 compute-1 sudo[235258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:11:00 compute-1 sudo[235258]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:11:00 compute-1 sudo[235258]: pam_unix(sudo:session): session closed for user root
Feb 02 10:11:00 compute-1 ceph-mon[80115]: pgmap v948: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:11:01 compute-1 nova_compute[226294]: 2026-02-02 10:11:01.275 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:01.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:01.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:11:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:02 compute-1 nova_compute[226294]: 2026-02-02 10:11:02.627 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.048 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.049 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.068 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.162 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.162 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.171 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.172 226298 INFO nova.compute.claims [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Claim successful on node compute-1.ctlplane.example.com
Feb 02 10:11:03 compute-1 ceph-mon[80115]: pgmap v949: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.270 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:11:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:03.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:03.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:03 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:11:03 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/20677095' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.757 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.765 226298 DEBUG nova.compute.provider_tree [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.813 226298 DEBUG nova.scheduler.client.report [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.858 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.859 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.925 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.926 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.959 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 02 10:11:03 compute-1 nova_compute[226294]: 2026-02-02 10:11:03.978 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.077 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.078 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.079 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Creating image(s)
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.102 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.126 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.154 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.157 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.211 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.212 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.213 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.213 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "b48fe8b86a7168723be684d0fce89ef3f0abcc61" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.238 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.243 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:11:04 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/20677095' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.340 226298 DEBUG nova.policy [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1b1695a2a70d4aa0aa350ba17d8f6d5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.514 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b48fe8b86a7168723be684d0fce89ef3f0abcc61 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.272s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.578 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] resizing rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.725 226298 DEBUG nova.objects.instance [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'migration_context' on Instance uuid 42dc4712-7770-4ecd-abba-8c8e970f8e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.745 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.745 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Ensure instance console log exists: /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.746 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.747 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:04 compute-1 nova_compute[226294]: 2026-02-02 10:11:04.747 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:05 compute-1 nova_compute[226294]: 2026-02-02 10:11:05.278 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Successfully created port: 29f94a0b-58b9-437a-9157-c3ce95454def _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 02 10:11:05 compute-1 ceph-mon[80115]: pgmap v950: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Feb 02 10:11:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:05.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:05 compute-1 podman[235474]: 2026-02-02 10:11:05.431852292 +0000 UTC m=+0.099992626 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:11:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:05.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:06 compute-1 nova_compute[226294]: 2026-02-02 10:11:05.998 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Successfully updated port: 29f94a0b-58b9-437a-9157-c3ce95454def _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 02 10:11:06 compute-1 nova_compute[226294]: 2026-02-02 10:11:06.010 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:11:06 compute-1 nova_compute[226294]: 2026-02-02 10:11:06.010 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:11:06 compute-1 nova_compute[226294]: 2026-02-02 10:11:06.011 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 02 10:11:06 compute-1 nova_compute[226294]: 2026-02-02 10:11:06.117 226298 DEBUG nova.compute.manager [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:11:06 compute-1 nova_compute[226294]: 2026-02-02 10:11:06.117 226298 DEBUG nova.compute.manager [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:11:06 compute-1 nova_compute[226294]: 2026-02-02 10:11:06.118 226298 DEBUG oslo_concurrency.lockutils [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:11:06 compute-1 nova_compute[226294]: 2026-02-02 10:11:06.257 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 02 10:11:06 compute-1 nova_compute[226294]: 2026-02-02 10:11:06.309 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:06 compute-1 ceph-mon[80115]: pgmap v951: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.081 226298 DEBUG nova.network.neutron [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.108 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.108 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance network_info: |[{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.109 226298 DEBUG oslo_concurrency.lockutils [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.109 226298 DEBUG nova.network.neutron [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.114 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start _get_guest_xml network_info=[{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'encrypted': False, 'guest_format': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'device_type': 'disk', 'boot_index': 0, 'image_id': 'd5e062d7-95ef-409c-9ad0-60f7cf6f44ce'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.120 226298 WARNING nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.126 226298 DEBUG nova.virt.libvirt.host [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.127 226298 DEBUG nova.virt.libvirt.host [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.139 226298 DEBUG nova.virt.libvirt.host [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.140 226298 DEBUG nova.virt.libvirt.host [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
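The two probes above check for a CPU controller first under cgroups v1 and then under cgroups v2; on this host only the second succeeds. A cgroup v2 host advertises its enabled controllers in /sys/fs/cgroup/cgroup.controllers, so an equivalent standalone check (a simplified sketch, not nova's actual helper) looks like:

    from pathlib import Path

    def has_cgroupsv2_cpu_controller(root="/sys/fs/cgroup"):
        # cgroup v2 lists the controllers enabled on the root hierarchy in one file.
        controllers_file = Path(root, "cgroup.controllers")
        if not controllers_file.exists():
            return False
        return "cpu" in controllers_file.read_text().split()

    print(has_cgroupsv2_cpu_controller())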
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.141 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.142 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-02T10:01:40Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1194feb9-e285-414e-825a-1e77171d092f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-02T10:01:42Z,direct_url=<?>,disk_format='qcow2',id=d5e062d7-95ef-409c-9ad0-60f7cf6f44ce,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='823d3e7e313a44e9a50531e3fef22a1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-02T10:01:45Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.143 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.143 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.144 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.145 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.145 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.146 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.146 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.147 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.148 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.148 226298 DEBUG nova.virt.hardware [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
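With no topology constraints from the flavor or image (all of the limits above are 0:0:0), the only admissible layout for a single vCPU is 1 socket x 1 core x 1 thread, which is what later lands in the <topology> element of the guest XML. A simplified illustration of that enumeration under the logged 65536/65536/65536 limits (not nova's actual code):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Enumerate (sockets, cores, threads) combinations that exactly cover the vCPU count.
        for sockets in range(1, min(max_sockets, vcpus) + 1):
            for cores in range(1, min(max_cores, vcpus) + 1):
                for threads in range(1, min(max_threads, vcpus) + 1):
                    if sockets * cores * threads == vcpus:
                        yield sockets, cores, threads

    print(list(possible_topologies(1)))   # -> [(1, 1, 1)]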
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.154 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:11:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:07.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:07.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.441992) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067442024, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1443, "num_deletes": 255, "total_data_size": 3522886, "memory_usage": 3616896, "flush_reason": "Manual Compaction"}
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067462832, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2262069, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28409, "largest_seqno": 29847, "table_properties": {"data_size": 2256038, "index_size": 3230, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 12928, "raw_average_key_size": 19, "raw_value_size": 2243786, "raw_average_value_size": 3358, "num_data_blocks": 143, "num_entries": 668, "num_filter_entries": 668, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770026962, "oldest_key_time": 1770026962, "file_creation_time": 1770027067, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 20921 microseconds, and 4325 cpu microseconds.
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.462907) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2262069 bytes OK
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.462936) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466057) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466083) EVENT_LOG_v1 {"time_micros": 1770027067466075, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466127) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3516070, prev total WAL file size 3516070, number of live WAL files 2.
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466954) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353031' seq:72057594037927935, type:22 .. '6C6F676D00373532' seq:0, type:0; will stop at (end)
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2209KB)], [54(13MB)]
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067466978, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16260862, "oldest_snapshot_seqno": -1}
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5989 keys, 16112660 bytes, temperature: kUnknown
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067594872, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 16112660, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16070218, "index_size": 26396, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14981, "raw_key_size": 152759, "raw_average_key_size": 25, "raw_value_size": 15959887, "raw_average_value_size": 2664, "num_data_blocks": 1081, "num_entries": 5989, "num_filter_entries": 5989, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027067, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.595229) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 16112660 bytes
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.596876) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 127.1 rd, 125.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 13.4 +0.0 blob) out(15.4 +0.0 blob), read-write-amplify(14.3) write-amplify(7.1) OK, records in: 6517, records dropped: 528 output_compression: NoCompression
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.596912) EVENT_LOG_v1 {"time_micros": 1770027067596892, "job": 32, "event": "compaction_finished", "compaction_time_micros": 127985, "compaction_time_cpu_micros": 21052, "output_level": 6, "num_output_files": 1, "total_output_size": 16112660, "num_input_records": 6517, "num_output_records": 5989, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067597351, "job": 32, "event": "table_file_deletion", "file_number": 56}
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027067599030, "job": 32, "event": "table_file_deletion", "file_number": 54}
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.466894) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599127) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599139) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599169) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:11:07 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:11:07.599174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:11:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 02 10:11:07 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2282015443' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.680 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.687 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
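The "ceph mon dump --format=json" call above is how the libvirt driver learns the monitor endpoints that later show up as <host> elements in the RBD disk sources of the guest XML. A sketch that runs the same command and extracts those endpoints, assuming the same --id/--conf pair as in the log:

    import json
    import subprocess

    cmd = ["ceph", "mon", "dump", "--format=json",
           "--id", "openstack", "--conf", "/etc/ceph/ceph.conf"]
    mon_map = json.loads(subprocess.check_output(cmd))

    # Each entry in "mons" carries the monitor's name and address.
    for mon in mon_map["mons"]:
        print(mon["name"], mon.get("public_addr", mon.get("addr")))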
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.711 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:11:07 compute-1 nova_compute[226294]: 2026-02-02 10:11:07.715 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:11:08 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 02 10:11:08 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/17390625' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.112 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.115 226298 DEBUG nova.virt.libvirt.vif [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:11:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1589717047',display_name='tempest-TestNetworkBasicOps-server-1589717047',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1589717047',id=11,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFKLhtXPFNo+7qMy7WM4rXv1wxOn6wk80R7orPjLFemWslU1farAMLdF2l7TazRd92gQv0m2wSsyelv9AIIl5lW/89YdwjsAA40J0bv4RJZ9H+7Em3wwtPI4Gx0836EIRw==',key_name='tempest-TestNetworkBasicOps-746965999',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-n5k0k93c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:11:04Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=42dc4712-7770-4ecd-abba-8c8e970f8e46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.116 226298 DEBUG nova.network.os_vif_util [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.118 226298 DEBUG nova.network.os_vif_util [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.121 226298 DEBUG nova.objects.instance [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'pci_devices' on Instance uuid 42dc4712-7770-4ecd-abba-8c8e970f8e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.145 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] End _get_guest_xml xml=<domain type="kvm">
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <uuid>42dc4712-7770-4ecd-abba-8c8e970f8e46</uuid>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <name>instance-0000000b</name>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <memory>131072</memory>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <vcpu>1</vcpu>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <metadata>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <nova:name>tempest-TestNetworkBasicOps-server-1589717047</nova:name>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <nova:creationTime>2026-02-02 10:11:07</nova:creationTime>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <nova:flavor name="m1.nano">
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <nova:memory>128</nova:memory>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <nova:disk>1</nova:disk>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <nova:swap>0</nova:swap>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <nova:ephemeral>0</nova:ephemeral>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <nova:vcpus>1</nova:vcpus>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       </nova:flavor>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <nova:owner>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <nova:user uuid="1b1695a2a70d4aa0aa350ba17d8f6d5e">tempest-TestNetworkBasicOps-793549693-project-member</nova:user>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <nova:project uuid="efbfe697ca674d72b47da5adf3e42c0c">tempest-TestNetworkBasicOps-793549693</nova:project>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       </nova:owner>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <nova:root type="image" uuid="d5e062d7-95ef-409c-9ad0-60f7cf6f44ce"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <nova:ports>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <nova:port uuid="29f94a0b-58b9-437a-9157-c3ce95454def">
Feb 02 10:11:08 compute-1 nova_compute[226294]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         </nova:port>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       </nova:ports>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     </nova:instance>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   </metadata>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <sysinfo type="smbios">
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <system>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <entry name="manufacturer">RDO</entry>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <entry name="product">OpenStack Compute</entry>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <entry name="serial">42dc4712-7770-4ecd-abba-8c8e970f8e46</entry>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <entry name="uuid">42dc4712-7770-4ecd-abba-8c8e970f8e46</entry>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <entry name="family">Virtual Machine</entry>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     </system>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   </sysinfo>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <os>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <boot dev="hd"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <smbios mode="sysinfo"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   </os>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <features>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <acpi/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <apic/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <vmcoreinfo/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   </features>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <clock offset="utc">
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <timer name="pit" tickpolicy="delay"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <timer name="hpet" present="no"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   </clock>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <cpu mode="host-model" match="exact">
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <topology sockets="1" cores="1" threads="1"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   </cpu>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   <devices>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <disk type="network" device="disk">
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <driver type="raw" cache="none"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <source protocol="rbd" name="vms/42dc4712-7770-4ecd-abba-8c8e970f8e46_disk">
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <host name="192.168.122.100" port="6789"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <host name="192.168.122.102" port="6789"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <host name="192.168.122.101" port="6789"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       </source>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <auth username="openstack">
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <target dev="vda" bus="virtio"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <disk type="network" device="cdrom">
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <driver type="raw" cache="none"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <source protocol="rbd" name="vms/42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config">
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <host name="192.168.122.100" port="6789"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <host name="192.168.122.102" port="6789"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <host name="192.168.122.101" port="6789"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       </source>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <auth username="openstack">
Feb 02 10:11:08 compute-1 nova_compute[226294]:         <secret type="ceph" uuid="d241d473-9fcb-5f74-b163-f1ca4454e7f1"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       </auth>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <target dev="sda" bus="sata"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     </disk>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <interface type="ethernet">
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <mac address="fa:16:3e:5f:0c:ce"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <model type="virtio"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <driver name="vhost" rx_queue_size="512"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <mtu size="1442"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <target dev="tap29f94a0b-58"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     </interface>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <serial type="pty">
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <log file="/var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/console.log" append="off"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     </serial>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <video>
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <model type="virtio"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     </video>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <input type="tablet" bus="usb"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <rng model="virtio">
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <backend model="random">/dev/urandom</backend>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     </rng>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="pci" model="pcie-root-port"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <controller type="usb" index="0"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     <memballoon model="virtio">
Feb 02 10:11:08 compute-1 nova_compute[226294]:       <stats period="10"/>
Feb 02 10:11:08 compute-1 nova_compute[226294]:     </memballoon>
Feb 02 10:11:08 compute-1 nova_compute[226294]:   </devices>
Feb 02 10:11:08 compute-1 nova_compute[226294]: </domain>
Feb 02 10:11:08 compute-1 nova_compute[226294]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
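The domain XML dumped above is the document nova hands to libvirt for this guest. For ad-hoc inspection or reproduction outside nova, the same XML can be fed to libvirt-python directly; a minimal sketch, assuming the XML has been saved to a hypothetical guest.xml (nova drives the domain lifecycle through its own guest wrapper, so this is only for experimentation):

    import libvirt

    with open("guest.xml") as f:
        domain_xml = f.read()

    conn = libvirt.open("qemu:///system")
    try:
        # createXML starts a transient domain from the XML document.
        dom = conn.createXML(domain_xml, 0)
        print(dom.name(), dom.UUIDString())
    finally:
        conn.close()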
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.146 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Preparing to wait for external event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.147 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.147 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.148 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.149 226298 DEBUG nova.virt.libvirt.vif [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-02T10:11:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1589717047',display_name='tempest-TestNetworkBasicOps-server-1589717047',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1589717047',id=11,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFKLhtXPFNo+7qMy7WM4rXv1wxOn6wk80R7orPjLFemWslU1farAMLdF2l7TazRd92gQv0m2wSsyelv9AIIl5lW/89YdwjsAA40J0bv4RJZ9H+7Em3wwtPI4Gx0836EIRw==',key_name='tempest-TestNetworkBasicOps-746965999',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-n5k0k93c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-02T10:11:04Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=42dc4712-7770-4ecd-abba-8c8e970f8e46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.149 226298 DEBUG nova.network.os_vif_util [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.150 226298 DEBUG nova.network.os_vif_util [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.151 226298 DEBUG os_vif [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.152 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.152 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.153 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.157 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.157 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29f94a0b-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.157 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29f94a0b-58, col_values=(('external_ids', {'iface-id': '29f94a0b-58b9-437a-9157-c3ce95454def', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:0c:ce', 'vm-uuid': '42dc4712-7770-4ecd-abba-8c8e970f8e46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.158 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:08 compute-1 NetworkManager[49055]: <info>  [1770027068.1595] manager: (tap29f94a0b-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.162 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.164 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.164 226298 INFO os_vif [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58')
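Plugging the VIF here amounts to the two OVSDB transactions logged above: an idempotent AddPortCommand on br-int plus a DbSetCommand writing the external_ids that OVN uses to bind the port. A rough equivalent via the ovs-vsctl CLI, shown only to make that wiring explicit (values copied from the log; this is not the code path nova/os-vif actually uses):

    import subprocess

    bridge, port = "br-int", "tap29f94a0b-58"
    external_ids = {
        "iface-id": "29f94a0b-58b9-437a-9157-c3ce95454def",
        "iface-status": "active",
        "attached-mac": "fa:16:3e:5f:0c:ce",
        "vm-uuid": "42dc4712-7770-4ecd-abba-8c8e970f8e46",
    }

    # Idempotent port add, then set the external_ids OVN matches against its logical port.
    subprocess.check_call(["ovs-vsctl", "--may-exist", "add-port", bridge, port])
    subprocess.check_call(["ovs-vsctl", "set", "Interface", port]
                          + ["external_ids:%s=%s" % kv for kv in external_ids.items()])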
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.225 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.225 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.226 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] No VIF found with MAC fa:16:3e:5f:0c:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.227 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Using config drive
Feb 02 10:11:08 compute-1 nova_compute[226294]: 2026-02-02 10:11:08.262 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:11:08 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2282015443' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:11:08 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/17390625' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.004 226298 DEBUG nova.network.neutron [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.005 226298 DEBUG nova.network.neutron [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.024 226298 DEBUG oslo_concurrency.lockutils [req-ceb4eb3a-cc41-4a24-a844-98602c9f2798 req-b54bb0de-5dea-4e50-8a34-8a934d920aac b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
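The network_info blob cached just above is a list of VIF dicts. A tiny helper like the following, illustrative only and with field names taken from the JSON in that cache update, pulls the fixed addresses out of such a structure:

    def fixed_ips(network_info):
        """Collect fixed IP addresses from a nova network_info list."""
        ips = []
        for vif in network_info:
            for subnet in vif["network"]["subnets"]:
                ips.extend(ip["address"] for ip in subnet["ips"]
                           if ip["type"] == "fixed")
        return ips

    # With the cache entry logged above this returns ["10.100.0.6"].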
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.152 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Creating config drive at /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.160 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyunjrs_n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.286 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyunjrs_n" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.325 226298 DEBUG nova.storage.rbd_utils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] rbd image 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.329 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:11:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:09.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:09.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
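The anonymous "HEAD /" requests hitting radosgw every couple of seconds look like load-balancer health probes from the controller addresses. A stdlib-only sketch of such a probe; the port and timeout are assumptions, not values taken from the log:

    import http.client

    def rgw_alive(host, port=8080, timeout=2.0):
        conn = http.client.HTTPConnection(host, port, timeout=timeout)
        try:
            conn.request("HEAD", "/")
            return conn.getresponse().status == 200  # beast logs these as 200
        except OSError:
            return False
        finally:
            conn.close()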
Feb 02 10:11:09 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1237912221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:09 compute-1 ceph-mon[80115]: pgmap v952: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.513 226298 DEBUG oslo_concurrency.processutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config 42dc4712-7770-4ecd-abba-8c8e970f8e46_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.514 226298 INFO nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Deleting local config drive /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46/disk.config because it was imported into RBD.
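The config-drive sequence above (mkisofs builds disk.config, rbd import copies it into the vms pool, then the local file is removed) can be summarized in a few lines. This is only a sketch; the paths, pool, client id and flags are copied from the logged commands, with the publisher string omitted:

    import os
    import subprocess

    def build_and_import_config_drive(metadata_dir, instance_uuid,
                                      pool="vms", ceph_id="openstack",
                                      conf="/etc/ceph/ceph.conf"):
        iso = f"/var/lib/nova/instances/{instance_uuid}/disk.config"
        # Same mkisofs invocation as the CMD logged above (publisher omitted)
        subprocess.run(["/usr/bin/mkisofs", "-o", iso, "-ldots", "-allow-lowercase",
                        "-allow-multidot", "-l", "-quiet", "-J", "-r",
                        "-V", "config-2", metadata_dir], check=True)
        # Import into RBD so the config drive lives alongside the instance disks
        subprocess.run(["rbd", "import", "--pool", pool, iso,
                        f"{instance_uuid}_disk.config", "--image-format=2",
                        "--id", ceph_id, "--conf", conf], check=True)
        os.unlink(iso)  # the local copy is redundant once it is in RBD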
Feb 02 10:11:09 compute-1 systemd[1]: Starting libvirt secret daemon...
Feb 02 10:11:09 compute-1 systemd[1]: Started libvirt secret daemon.
Feb 02 10:11:09 compute-1 kernel: tap29f94a0b-58: entered promiscuous mode
Feb 02 10:11:09 compute-1 NetworkManager[49055]: <info>  [1770027069.6166] manager: (tap29f94a0b-58): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Feb 02 10:11:09 compute-1 ovn_controller[133666]: 2026-02-02T10:11:09Z|00058|binding|INFO|Claiming lport 29f94a0b-58b9-437a-9157-c3ce95454def for this chassis.
Feb 02 10:11:09 compute-1 ovn_controller[133666]: 2026-02-02T10:11:09Z|00059|binding|INFO|29f94a0b-58b9-437a-9157-c3ce95454def: Claiming fa:16:3e:5f:0c:ce 10.100.0.6
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.617 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.621 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.627 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.632 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.641 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:0c:ce 10.100.0.6'], port_security=['fa:16:3e:5f:0c:ce 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '42dc4712-7770-4ecd-abba-8c8e970f8e46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a8371570-b364-43fb-9d49-41b819ae5fa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7385ccf6-5875-4ca6-bbfb-418e49c25618, chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=29f94a0b-58b9-437a-9157-c3ce95454def) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.643 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 29f94a0b-58b9-437a-9157-c3ce95454def in datapath 07b5f9e6-a53d-47d1-be8b-5269063b871d bound to our chassis
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.644 143542 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 07b5f9e6-a53d-47d1-be8b-5269063b871d
Feb 02 10:11:09 compute-1 systemd-machined[195072]: New machine qemu-3-instance-0000000b.
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.656 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[16437a34-54db-4d11-97b6-6bd19be6c91e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.657 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap07b5f9e6-a1 in ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.659 229827 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap07b5f9e6-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.659 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[2b74dae0-39f8-4d4d-89e6-401173eed0e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.660 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[1183bc3c-ad31-4aa3-9ea4-f98d79e2a7a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.672 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[d75f4cf4-9185-47de-9757-ea2bbeabe486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_controller[133666]: 2026-02-02T10:11:09Z|00060|binding|INFO|Setting lport 29f94a0b-58b9-437a-9157-c3ce95454def ovn-installed in OVS
Feb 02 10:11:09 compute-1 ovn_controller[133666]: 2026-02-02T10:11:09Z|00061|binding|INFO|Setting lport 29f94a0b-58b9-437a-9157-c3ce95454def up in Southbound
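At this point ovn-controller has claimed the lport, marked it ovn-installed in OVS and set it up in the Southbound DB. One way to confirm a binding from the chassis is to query the Southbound Port_Binding table; the sketch below wraps a standard ovn-sbctl lookup and assumes the local ovn-controller's DB connection settings apply:

    import subprocess

    def port_binding(logical_port):
        # Ask the Southbound DB which chassis, if any, holds this logical port
        return subprocess.run(
            ["ovn-sbctl", "--format=json", "find", "Port_Binding",
             f"logical_port={logical_port}"],
            check=True, capture_output=True, text=True,
        ).stdout

    print(port_binding("29f94a0b-58b9-437a-9157-c3ce95454def"))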
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.677 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-0000000b.
Feb 02 10:11:09 compute-1 systemd-udevd[235660]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.700 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b37b91-c3e0-4330-81d6-1e97d34de4bf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 NetworkManager[49055]: <info>  [1770027069.7061] device (tap29f94a0b-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 02 10:11:09 compute-1 NetworkManager[49055]: <info>  [1770027069.7074] device (tap29f94a0b-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.726 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb1c889-7952-4fe5-9370-4e9ad00bc5df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 systemd-udevd[235662]: Network interface NamePolicy= disabled on kernel command line.
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.732 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[e4fa83cc-5fd1-4c61-b70b-25c166f31665]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 NetworkManager[49055]: <info>  [1770027069.7333] manager: (tap07b5f9e6-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.764 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[3db2d985-7ac7-4314-8aae-f5057c335b61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.767 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5f9d47-83c3-41da-bf2d-a1d566e55c14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 NetworkManager[49055]: <info>  [1770027069.7849] device (tap07b5f9e6-a0): carrier: link connected
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.790 229842 DEBUG oslo.privsep.daemon [-] privsep: reply[0b9e79d4-4827-477d-8939-6bf1019e070d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.804 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[f63e72ce-bb45-4977-97d5-24a4ee317936]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap07b5f9e6-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:18:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423209, 'reachable_time': 37330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235690, 'error': None, 'target': 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.817 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[511e9d43-4f90-4c02-a1c4-c03a73e0355a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe41:18c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423209, 'tstamp': 423209}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235691, 'error': None, 'target': 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.834 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed1b81d-fc08-4895-bdbb-5d8cb59f4a0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap07b5f9e6-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:18:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423209, 'reachable_time': 37330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235692, 'error': None, 'target': 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.861 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a71186-027e-4c6e-a226-737d0c790707]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.908 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[80866ac5-5fe7-45db-8a8f-95475bfa623e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.910 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07b5f9e6-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.910 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.911 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap07b5f9e6-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.923 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 kernel: tap07b5f9e6-a0: entered promiscuous mode
Feb 02 10:11:09 compute-1 NetworkManager[49055]: <info>  [1770027069.9242] manager: (tap07b5f9e6-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.927 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.928 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap07b5f9e6-a0, col_values=(('external_ids', {'iface-id': '1cafc178-edb3-4734-825c-ef4e45193789'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.930 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 ovn_controller[133666]: 2026-02-02T10:11:09Z|00062|binding|INFO|Releasing lport 1cafc178-edb3-4734-825c-ef4e45193789 from this chassis (sb_readonly=0)
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.931 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.931 143542 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/07b5f9e6-a53d-47d1-be8b-5269063b871d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/07b5f9e6-a53d-47d1-be8b-5269063b871d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 02 10:11:09 compute-1 nova_compute[226294]: 2026-02-02 10:11:09.935 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.935 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[800e92d3-2c99-4c14-a7b6-0a5f81645cb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.937 143542 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: global
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     log         /dev/log local0 debug
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     log-tag     haproxy-metadata-proxy-07b5f9e6-a53d-47d1-be8b-5269063b871d
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     user        root
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     group       root
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     maxconn     1024
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     pidfile     /var/lib/neutron/external/pids/07b5f9e6-a53d-47d1-be8b-5269063b871d.pid.haproxy
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     daemon
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: defaults
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     log global
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     mode http
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     option httplog
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     option dontlognull
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     option http-server-close
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     option forwardfor
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     retries                 3
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     timeout http-request    30s
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     timeout connect         30s
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     timeout client          32s
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     timeout server          32s
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     timeout http-keep-alive 30s
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: listen listener
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     bind 169.254.169.254:80
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     server metadata /var/lib/neutron/metadata_proxy
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:     http-request add-header X-OVN-Network-ID 07b5f9e6-a53d-47d1-be8b-5269063b871d
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 02 10:11:09 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:09.937 143542 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'env', 'PROCESS_TAG=haproxy-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/07b5f9e6-a53d-47d1-be8b-5269063b871d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
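The haproxy_cfg dump above is the per-network metadata proxy configuration the agent writes before launching haproxy inside the ovnmeta namespace. A condensed sketch of how such a file could be rendered; the template simply mirrors the logged output, the defaults section is left out for brevity, and the default paths are examples only:

    HAPROXY_TEMPLATE = """\
    global
        log         /dev/log local0 debug
        log-tag     haproxy-metadata-proxy-{network_id}
        user        root
        group       root
        maxconn     1024
        pidfile     {pid_dir}/{network_id}.pid.haproxy
        daemon

    listen listener
        bind 169.254.169.254:80
        server metadata {socket_path}
        http-request add-header X-OVN-Network-ID {network_id}
    """

    def render_metadata_proxy_cfg(network_id,
                                  pid_dir="/var/lib/neutron/external/pids",
                                  socket_path="/var/lib/neutron/metadata_proxy"):
        return HAPROXY_TEMPLATE.format(network_id=network_id,
                                       pid_dir=pid_dir,
                                       socket_path=socket_path)

    print(render_metadata_proxy_cfg("07b5f9e6-a53d-47d1-be8b-5269063b871d"))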
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.104 226298 DEBUG nova.compute.manager [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.115 226298 DEBUG oslo_concurrency.lockutils [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.116 226298 DEBUG oslo_concurrency.lockutils [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.117 226298 DEBUG oslo_concurrency.lockutils [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.117 226298 DEBUG nova.compute.manager [req-3fdcb1c4-813c-4127-8e3b-6f420da4db69 req-a58829e8-2cdf-4d4c-967f-f35026035452 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Processing event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
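The lockutils lines above show nova serializing access to the per-instance event queue with a named lock ("<uuid>-events") while the network-vif-plugged event is popped and processed. A simplified stand-in for that pattern, not nova's actual InstanceEvents implementation:

    import threading
    from collections import defaultdict

    _events_lock = threading.Lock()
    _pending = defaultdict(list)          # instance uuid -> queued event names

    def record_event(instance_uuid, event_name):
        with _events_lock:                # mirrors the Acquiring/Releasing lines
            _pending[instance_uuid].append(event_name)

    def pop_event(instance_uuid, event_name):
        with _events_lock:
            queue = _pending.get(instance_uuid, [])
            if event_name in queue:
                queue.remove(event_name)
                return event_name
            return None                   # the "No waiting events found" case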
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.262 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.263 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770027070.262436, 42dc4712-7770-4ecd-abba-8c8e970f8e46 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.264 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] VM Started (Lifecycle Event)
Feb 02 10:11:10 compute-1 podman[235764]: 2026-02-02 10:11:10.26525355 +0000 UTC m=+0.050113932 container create 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.266 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.271 226298 INFO nova.virt.libvirt.driver [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance spawned successfully.
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.272 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.284 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.291 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.295 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.296 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.296 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.297 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.297 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.298 226298 DEBUG nova.virt.libvirt.driver [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 02 10:11:10 compute-1 systemd[1]: Started libpod-conmon-4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7.scope.
Feb 02 10:11:10 compute-1 systemd[1]: Started libcrun container.
Feb 02 10:11:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26cfac83bec4483eeb9fd6487ef88b8a7ccc7477473882fbfbe7498fdcc8d7a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 02 10:11:10 compute-1 podman[235764]: 2026-02-02 10:11:10.236183378 +0000 UTC m=+0.021043790 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.334 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.334 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770027070.2634013, 42dc4712-7770-4ecd-abba-8c8e970f8e46 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.335 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] VM Paused (Lifecycle Event)
Feb 02 10:11:10 compute-1 podman[235764]: 2026-02-02 10:11:10.339988285 +0000 UTC m=+0.124848687 container init 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:11:10 compute-1 podman[235764]: 2026-02-02 10:11:10.344400752 +0000 UTC m=+0.129261124 container start 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.360 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:11:10 compute-1 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [NOTICE]   (235785) : New worker (235787) forked
Feb 02 10:11:10 compute-1 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [NOTICE]   (235785) : Loading success.
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.370 226298 INFO nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Took 6.29 seconds to spawn the instance on the hypervisor.
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.371 226298 DEBUG nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.372 226298 DEBUG nova.virt.driver [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] Emitting event <LifecycleEvent: 1770027070.266223, 42dc4712-7770-4ecd-abba-8c8e970f8e46 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.372 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] VM Resumed (Lifecycle Event)
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.391 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.394 226298 DEBUG nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.418 226298 INFO nova.compute.manager [None req-9f87d029-9537-479a-a951-66a5b7f88d49 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] During sync_power_state the instance has a pending task (spawning). Skip.
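Both "Synchronizing instance power state" checks above end in a skip because the instance still carries a spawning task. A toy version of that decision; the numeric constants follow nova.compute.power_state (0 = NOSTATE, 1 = RUNNING), everything else is simplified for illustration:

    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_power_state, vm_power_state, task_state):
        if task_state is not None:
            return "skip: instance has a pending task (%s)" % task_state
        if db_power_state != vm_power_state:
            return "update DB power_state to %s" % vm_power_state
        return "in sync"

    print(sync_power_state(NOSTATE, RUNNING, "spawning"))  # -> skip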
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.441 226298 INFO nova.compute.manager [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Took 7.32 seconds to build instance.
Feb 02 10:11:10 compute-1 nova_compute[226294]: 2026-02-02 10:11:10.460 226298 DEBUG oslo_concurrency.lockutils [None req-7367c843-9ba1-4df9-91fe-c76f5d61c312 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:10 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/846757801' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:11 compute-1 nova_compute[226294]: 2026-02-02 10:11:11.340 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:11.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:11:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:11.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:11:11 compute-1 ceph-mon[80115]: pgmap v953: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:11:12 compute-1 nova_compute[226294]: 2026-02-02 10:11:12.193 226298 DEBUG nova.compute.manager [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:11:12 compute-1 nova_compute[226294]: 2026-02-02 10:11:12.194 226298 DEBUG oslo_concurrency.lockutils [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:12 compute-1 nova_compute[226294]: 2026-02-02 10:11:12.194 226298 DEBUG oslo_concurrency.lockutils [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:12 compute-1 nova_compute[226294]: 2026-02-02 10:11:12.195 226298 DEBUG oslo_concurrency.lockutils [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:12 compute-1 nova_compute[226294]: 2026-02-02 10:11:12.195 226298 DEBUG nova.compute.manager [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:11:12 compute-1 nova_compute[226294]: 2026-02-02 10:11:12.195 226298 WARNING nova.compute.manager [req-23ae7e7a-46be-452b-9e74-6cd7c5df83ed req-71a72e68-4fa5-4839-9d16-cb17af951f1b b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.
Feb 02 10:11:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:12 compute-1 ceph-mon[80115]: pgmap v954: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:11:13 compute-1 nova_compute[226294]: 2026-02-02 10:11:13.159 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:13 compute-1 podman[235798]: 2026-02-02 10:11:13.366269538 +0000 UTC m=+0.040938028 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
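The container health_status record above is podman reporting a healthy run of the ovn_metadata_agent healthcheck (the configured test is /openstack/healthcheck). Triggering the same probe by hand could look like the sketch below; the container name is taken from the log, the rest is an assumption:

    import subprocess

    def container_healthy(name="ovn_metadata_agent"):
        # "podman healthcheck run" exits 0 when the container's health test passes
        return subprocess.run(["podman", "healthcheck", "run", name]).returncode == 0

    print(container_healthy())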
Feb 02 10:11:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:13.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:13.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:13 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3026647851' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.456 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:14 compute-1 NetworkManager[49055]: <info>  [1770027074.4575] manager: (patch-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Feb 02 10:11:14 compute-1 NetworkManager[49055]: <info>  [1770027074.4587] manager: (patch-br-int-to-provnet-3738ab71-03c6-44c1-bc4f-10cf3e96782e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Feb 02 10:11:14 compute-1 ovn_controller[133666]: 2026-02-02T10:11:14Z|00063|binding|INFO|Releasing lport 1cafc178-edb3-4734-825c-ef4e45193789 from this chassis (sb_readonly=0)
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.474 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:14 compute-1 ovn_controller[133666]: 2026-02-02T10:11:14Z|00064|binding|INFO|Releasing lport 1cafc178-edb3-4734-825c-ef4e45193789 from this chassis (sb_readonly=0)
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.482 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.691 226298 DEBUG nova.compute.manager [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.692 226298 DEBUG nova.compute.manager [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.692 226298 DEBUG oslo_concurrency.lockutils [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.692 226298 DEBUG oslo_concurrency.lockutils [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.693 226298 DEBUG nova.network.neutron [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:11:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/11261957' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:14 compute-1 ceph-mon[80115]: pgmap v955: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.875 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.875 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:11:14 compute-1 nova_compute[226294]: 2026-02-02 10:11:14.875 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:11:15 compute-1 nova_compute[226294]: 2026-02-02 10:11:15.056 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:11:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:15.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:15.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:15 compute-1 nova_compute[226294]: 2026-02-02 10:11:15.843 226298 DEBUG nova.network.neutron [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:11:15 compute-1 nova_compute[226294]: 2026-02-02 10:11:15.843 226298 DEBUG nova.network.neutron [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:11:15 compute-1 nova_compute[226294]: 2026-02-02 10:11:15.920 226298 DEBUG oslo_concurrency.lockutils [req-22172e43-e558-4380-ba2d-a5898554720c req-7f0dbca5-c83e-4071-ab91-276d00729086 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:11:15 compute-1 nova_compute[226294]: 2026-02-02 10:11:15.921 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:11:15 compute-1 nova_compute[226294]: 2026-02-02 10:11:15.921 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 02 10:11:15 compute-1 nova_compute[226294]: 2026-02-02 10:11:15.922 226298 DEBUG nova.objects.instance [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 42dc4712-7770-4ecd-abba-8c8e970f8e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:11:16 compute-1 nova_compute[226294]: 2026-02-02 10:11:16.343 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:16 compute-1 ceph-mon[80115]: pgmap v956: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Feb 02 10:11:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:11:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:17.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:11:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:17.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.644 226298 DEBUG nova.network.neutron [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.687 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.688 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.688 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.688 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.688 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.689 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.792 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.792 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.792 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:11:17 compute-1 nova_compute[226294]: 2026-02-02 10:11:17.792 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.162 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:11:18 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:11:18 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3049014938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.305 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.439 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.439 226298 DEBUG nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.651 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.653 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4771MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.653 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.653 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.813 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Instance 42dc4712-7770-4ecd-abba-8c8e970f8e46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.813 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.813 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:11:18 compute-1 nova_compute[226294]: 2026-02-02 10:11:18.854 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:11:19 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3049014938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:19 compute-1 ceph-mon[80115]: pgmap v957: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Feb 02 10:11:19 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:11:19 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3378367145' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:19 compute-1 nova_compute[226294]: 2026-02-02 10:11:19.278 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:11:19 compute-1 nova_compute[226294]: 2026-02-02 10:11:19.282 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:11:19 compute-1 nova_compute[226294]: 2026-02-02 10:11:19.299 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:11:19 compute-1 nova_compute[226294]: 2026-02-02 10:11:19.321 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:11:19 compute-1 nova_compute[226294]: 2026-02-02 10:11:19.322 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:19.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:19.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:20 compute-1 nova_compute[226294]: 2026-02-02 10:11:20.092 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:11:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3378367145' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:20 compute-1 sudo[235867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:11:20 compute-1 sudo[235867]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:11:20 compute-1 sudo[235867]: pam_unix(sudo:session): session closed for user root
Feb 02 10:11:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1956883334' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:11:21 compute-1 ceph-mon[80115]: pgmap v958: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:11:21 compute-1 nova_compute[226294]: 2026-02-02 10:11:21.345 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:21.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:21.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:22 compute-1 ovn_controller[133666]: 2026-02-02T10:11:22Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:0c:ce 10.100.0.6
Feb 02 10:11:22 compute-1 ovn_controller[133666]: 2026-02-02T10:11:22Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:0c:ce 10.100.0.6
Feb 02 10:11:22 compute-1 ceph-mon[80115]: pgmap v959: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Feb 02 10:11:23 compute-1 nova_compute[226294]: 2026-02-02 10:11:23.164 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:23.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:23.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:24 compute-1 ceph-mon[80115]: pgmap v960: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 164 op/s
Feb 02 10:11:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:25.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:25.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:26 compute-1 nova_compute[226294]: 2026-02-02 10:11:26.348 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:26 compute-1 ceph-mon[80115]: pgmap v961: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Feb 02 10:11:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:27.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:27.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2166185677' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:11:28 compute-1 nova_compute[226294]: 2026-02-02 10:11:28.166 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1044531648' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:11:29 compute-1 ceph-mon[80115]: pgmap v962: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Feb 02 10:11:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb 02 10:11:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:29.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb 02 10:11:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:29.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:30 compute-1 ceph-mon[80115]: pgmap v963: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Feb 02 10:11:31 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/101131 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:11:31 compute-1 nova_compute[226294]: 2026-02-02 10:11:31.350 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:31.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:31.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:11:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:33 compute-1 nova_compute[226294]: 2026-02-02 10:11:33.167 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:33 compute-1 ceph-mon[80115]: pgmap v964: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 342 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Feb 02 10:11:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:33.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:33.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:34 compute-1 ceph-mon[80115]: pgmap v965: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 3.9 MiB/s wr, 165 op/s
Feb 02 10:11:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:35.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:11:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:35.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:11:36 compute-1 nova_compute[226294]: 2026-02-02 10:11:36.353 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:36 compute-1 podman[235900]: 2026-02-02 10:11:36.399667304 +0000 UTC m=+0.073938375 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 02 10:11:36 compute-1 ceph-mon[80115]: pgmap v966: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 74 op/s
Feb 02 10:11:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:11:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:37.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:11:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:37.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:38 compute-1 nova_compute[226294]: 2026-02-02 10:11:38.169 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:38 compute-1 ceph-mon[80115]: pgmap v967: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 74 op/s
Feb 02 10:11:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:39.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:39.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:40 compute-1 sudo[235927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:11:40 compute-1 sudo[235927]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:11:40 compute-1 sudo[235927]: pam_unix(sudo:session): session closed for user root
Feb 02 10:11:40 compute-1 ceph-mon[80115]: pgmap v968: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Feb 02 10:11:41 compute-1 nova_compute[226294]: 2026-02-02 10:11:41.354 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:41.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:41.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:42 compute-1 ceph-mon[80115]: pgmap v969: 353 pgs: 353 active+clean; 167 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 73 op/s
Feb 02 10:11:43 compute-1 nova_compute[226294]: 2026-02-02 10:11:43.171 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:43.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:43.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:43 compute-1 sshd-session[235954]: Invalid user solv from 80.94.92.184 port 56384
Feb 02 10:11:43 compute-1 podman[235956]: 2026-02-02 10:11:43.632882526 +0000 UTC m=+0.043942718 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 02 10:11:43 compute-1 sshd-session[235954]: Connection closed by invalid user solv 80.94.92.184 port 56384 [preauth]
Feb 02 10:11:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:44.910 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:44.911 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:45 compute-1 ceph-mon[80115]: pgmap v970: 353 pgs: 353 active+clean; 198 MiB data, 391 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 132 op/s
Feb 02 10:11:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:11:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:45.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:11:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:45.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:46 compute-1 nova_compute[226294]: 2026-02-02 10:11:46.356 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:47 compute-1 ceph-mon[80115]: pgmap v971: 353 pgs: 353 active+clean; 198 MiB data, 391 MiB used, 60 GiB / 60 GiB avail; 288 KiB/s rd, 2.1 MiB/s wr, 58 op/s
Feb 02 10:11:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:47.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:47.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:11:48 compute-1 nova_compute[226294]: 2026-02-02 10:11:48.217 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:49 compute-1 ceph-mon[80115]: pgmap v972: 353 pgs: 353 active+clean; 200 MiB data, 392 MiB used, 60 GiB / 60 GiB avail; 318 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Feb 02 10:11:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:49.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:49.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [WARNING] 032/101150 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Feb 02 10:11:50 compute-1 ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx[86108]: [ALERT] 032/101150 (4) : backend 'backend' has no server available!
Feb 02 10:11:50 compute-1 ceph-mon[80115]: pgmap v973: 353 pgs: 353 active+clean; 200 MiB data, 392 MiB used, 60 GiB / 60 GiB avail; 318 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Feb 02 10:11:51 compute-1 nova_compute[226294]: 2026-02-02 10:11:51.357 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:51.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:51.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:53 compute-1 ceph-mon[80115]: pgmap v974: 353 pgs: 353 active+clean; 200 MiB data, 392 MiB used, 60 GiB / 60 GiB avail; 318 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Feb 02 10:11:53 compute-1 nova_compute[226294]: 2026-02-02 10:11:53.219 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:53.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:53 compute-1 sudo[235980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:11:53 compute-1 sudo[235980]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:11:53 compute-1 sudo[235980]: pam_unix(sudo:session): session closed for user root
Feb 02 10:11:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:53.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:53 compute-1 sudo[236005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:11:53 compute-1 sudo[236005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:11:53 compute-1 sudo[236005]: pam_unix(sudo:session): session closed for user root
Feb 02 10:11:54 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:11:54 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:11:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:11:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:11:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:11:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:11:55 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:11:55 compute-1 ceph-mon[80115]: pgmap v975: 353 pgs: 353 active+clean; 200 MiB data, 392 MiB used, 60 GiB / 60 GiB avail; 318 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Feb 02 10:11:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/488075457' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:11:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/488075457' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:11:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:11:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:55.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:11:55 compute-1 nova_compute[226294]: 2026-02-02 10:11:55.479 226298 INFO nova.compute.manager [None req-55322302-f8bd-44d7-bf6e-df54484870c9 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Get console output
Feb 02 10:11:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:55.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:55 compute-1 nova_compute[226294]: 2026-02-02 10:11:55.486 232427 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 02 10:11:56 compute-1 nova_compute[226294]: 2026-02-02 10:11:56.359 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:57 compute-1 ceph-mon[80115]: pgmap v976: 353 pgs: 353 active+clean; 200 MiB data, 392 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 18 KiB/s wr, 5 op/s
Feb 02 10:11:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:11:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:11:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:57.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.479 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:57.480 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:11:57 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:11:57.481 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:11:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:57.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.781 226298 DEBUG nova.compute.manager [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.781 226298 DEBUG nova.compute.manager [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.782 226298 DEBUG oslo_concurrency.lockutils [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.782 226298 DEBUG oslo_concurrency.lockutils [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.783 226298 DEBUG nova.network.neutron [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.848 226298 DEBUG nova.compute.manager [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 DEBUG oslo_concurrency.lockutils [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 DEBUG oslo_concurrency.lockutils [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 DEBUG oslo_concurrency.lockutils [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 DEBUG nova.compute.manager [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:11:57 compute-1 nova_compute[226294]: 2026-02-02 10:11:57.849 226298 WARNING nova.compute.manager [req-3a231fa5-90ba-4967-8ce6-2da85f8f4a16 req-c201a885-16ff-4df3-8b61-5fbd1885d3a4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.
Feb 02 10:11:58 compute-1 nova_compute[226294]: 2026-02-02 10:11:58.221 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:11:58 compute-1 nova_compute[226294]: 2026-02-02 10:11:58.673 226298 INFO nova.compute.manager [None req-e0c78638-0642-4dfe-93c7-9276d0f7149b 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Get console output
Feb 02 10:11:58 compute-1 nova_compute[226294]: 2026-02-02 10:11:58.679 232427 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 02 10:11:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:11:59.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:11:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:11:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:11:59.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:11:59 compute-1 nova_compute[226294]: 2026-02-02 10:11:59.607 226298 DEBUG nova.network.neutron [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:11:59 compute-1 nova_compute[226294]: 2026-02-02 10:11:59.607 226298 DEBUG nova.network.neutron [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:11:59 compute-1 nova_compute[226294]: 2026-02-02 10:11:59.657 226298 DEBUG oslo_concurrency.lockutils [req-d615557d-c62a-4287-a9d7-01ec08732ff2 req-c7311684-d6be-4e7c-83ef-4f22f1f438eb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:11:59 compute-1 ceph-mon[80115]: pgmap v977: 353 pgs: 353 active+clean; 200 MiB data, 392 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 19 KiB/s wr, 6 op/s
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.003 226298 DEBUG nova.compute.manager [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.004 226298 DEBUG oslo_concurrency.lockutils [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.004 226298 DEBUG oslo_concurrency.lockutils [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.004 226298 DEBUG oslo_concurrency.lockutils [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.005 226298 DEBUG nova.compute.manager [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.005 226298 WARNING nova.compute.manager [req-582017d2-041f-4d88-a3e4-85e27d8e93e9 req-bc265ad0-bead-4382-aea8-12116520dcaa b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.556 226298 DEBUG nova.compute.manager [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.557 226298 DEBUG nova.compute.manager [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.557 226298 DEBUG oslo_concurrency.lockutils [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.557 226298 DEBUG oslo_concurrency.lockutils [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.558 226298 DEBUG nova.network.neutron [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:12:00 compute-1 sudo[236066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:12:00 compute-1 sudo[236066]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:12:00 compute-1 sudo[236066]: pam_unix(sudo:session): session closed for user root
Feb 02 10:12:00 compute-1 sudo[236091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:12:00 compute-1 sudo[236091]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:12:00 compute-1 sudo[236091]: pam_unix(sudo:session): session closed for user root
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.845 226298 INFO nova.compute.manager [None req-d72af9e5-dfb3-47c2-97db-8b229ea6a624 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Get console output
Feb 02 10:12:00 compute-1 nova_compute[226294]: 2026-02-02 10:12:00.850 232427 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 02 10:12:01 compute-1 nova_compute[226294]: 2026-02-02 10:12:01.401 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:01.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:01.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:12:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.098 226298 DEBUG nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.098 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.099 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.099 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.099 226298 DEBUG nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.099 226298 WARNING nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.100 226298 DEBUG nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.100 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.100 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.100 226298 DEBUG oslo_concurrency.lockutils [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.101 226298 DEBUG nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.101 226298 WARNING nova.compute.manager [req-dfa7e5fb-2da9-4da5-9fe2-fa93a95ded30 req-34dde5dd-d43e-40da-88f5-af401e8e0312 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state active and task_state None.
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.403 226298 DEBUG nova.network.neutron [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.403 226298 DEBUG nova.network.neutron [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:12:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:02 compute-1 nova_compute[226294]: 2026-02-02 10:12:02.427 226298 DEBUG oslo_concurrency.lockutils [req-7cd869a1-ab19-4c1f-b9f9-7cef4a76cfd0 req-ee0cf273-0626-4403-ac31-fefb887aceeb b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:12:02 compute-1 ceph-mon[80115]: pgmap v978: 353 pgs: 353 active+clean; 200 MiB data, 392 MiB used, 60 GiB / 60 GiB avail; 6.6 KiB/s rd, 15 KiB/s wr, 1 op/s
Feb 02 10:12:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:12:03 compute-1 nova_compute[226294]: 2026-02-02 10:12:03.223 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:03.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:03 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:03.483 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:12:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:12:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:03.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:12:04 compute-1 ceph-mon[80115]: pgmap v979: 353 pgs: 353 active+clean; 200 MiB data, 392 MiB used, 60 GiB / 60 GiB avail; 6.6 KiB/s rd, 15 KiB/s wr, 1 op/s
Feb 02 10:12:04 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2034815468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:05.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:12:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:05.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.227 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.228 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.229 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.229 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.230 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.232 226298 INFO nova.compute.manager [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Terminating instance
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.234 226298 DEBUG nova.compute.manager [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 02 10:12:06 compute-1 kernel: tap29f94a0b-58 (unregistering): left promiscuous mode
Feb 02 10:12:06 compute-1 NetworkManager[49055]: <info>  [1770027126.3116] device (tap29f94a0b-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 02 10:12:06 compute-1 ovn_controller[133666]: 2026-02-02T10:12:06Z|00065|binding|INFO|Releasing lport 29f94a0b-58b9-437a-9157-c3ce95454def from this chassis (sb_readonly=0)
Feb 02 10:12:06 compute-1 ovn_controller[133666]: 2026-02-02T10:12:06Z|00066|binding|INFO|Setting lport 29f94a0b-58b9-437a-9157-c3ce95454def down in Southbound
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.330 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:06 compute-1 ovn_controller[133666]: 2026-02-02T10:12:06Z|00067|binding|INFO|Removing iface tap29f94a0b-58 ovn-installed in OVS
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.341 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.351 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:0c:ce 10.100.0.6'], port_security=['fa:16:3e:5f:0c:ce 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '42dc4712-7770-4ecd-abba-8c8e970f8e46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'efbfe697ca674d72b47da5adf3e42c0c', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a8371570-b364-43fb-9d49-41b819ae5fa9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7385ccf6-5875-4ca6-bbfb-418e49c25618, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>], logical_port=29f94a0b-58b9-437a-9157-c3ce95454def) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9094efd8b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.354 143542 INFO neutron.agent.ovn.metadata.agent [-] Port 29f94a0b-58b9-437a-9157-c3ce95454def in datapath 07b5f9e6-a53d-47d1-be8b-5269063b871d unbound from our chassis
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.356 143542 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 07b5f9e6-a53d-47d1-be8b-5269063b871d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.358 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[3a7dc1d2-20ec-4862-a957-780e73dd6586]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.359 143542 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d namespace which is not needed anymore
Feb 02 10:12:06 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Feb 02 10:12:06 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Consumed 14.328s CPU time.
Feb 02 10:12:06 compute-1 systemd-machined[195072]: Machine qemu-3-instance-0000000b terminated.
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.399 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.467 226298 INFO nova.virt.libvirt.driver [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance destroyed successfully.
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.467 226298 DEBUG nova.objects.instance [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lazy-loading 'resources' on Instance uuid 42dc4712-7770-4ecd-abba-8c8e970f8e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 02 10:12:06 compute-1 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [NOTICE]   (235785) : haproxy version is 2.8.14-c23fe91
Feb 02 10:12:06 compute-1 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [NOTICE]   (235785) : path to executable is /usr/sbin/haproxy
Feb 02 10:12:06 compute-1 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [WARNING]  (235785) : Exiting Master process...
Feb 02 10:12:06 compute-1 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [ALERT]    (235785) : Current worker (235787) exited with code 143 (Terminated)
Feb 02 10:12:06 compute-1 neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d[235781]: [WARNING]  (235785) : All workers exited. Exiting... (0)
Feb 02 10:12:06 compute-1 systemd[1]: libpod-4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7.scope: Deactivated successfully.
Feb 02 10:12:06 compute-1 podman[236144]: 2026-02-02 10:12:06.514401991 +0000 UTC m=+0.054444277 container died 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 02 10:12:06 compute-1 ceph-mon[80115]: pgmap v980: 353 pgs: 353 active+clean; 121 MiB data, 347 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 20 KiB/s wr, 30 op/s
Feb 02 10:12:06 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7-userdata-shm.mount: Deactivated successfully.
Feb 02 10:12:06 compute-1 systemd[1]: var-lib-containers-storage-overlay-26cfac83bec4483eeb9fd6487ef88b8a7ccc7477473882fbfbe7498fdcc8d7a6-merged.mount: Deactivated successfully.
Feb 02 10:12:06 compute-1 podman[236144]: 2026-02-02 10:12:06.662320649 +0000 UTC m=+0.202362925 container cleanup 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 02 10:12:06 compute-1 systemd[1]: libpod-conmon-4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7.scope: Deactivated successfully.
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.698 226298 DEBUG nova.virt.libvirt.vif [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-02T10:11:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1589717047',display_name='tempest-TestNetworkBasicOps-server-1589717047',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1589717047',id=11,image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFKLhtXPFNo+7qMy7WM4rXv1wxOn6wk80R7orPjLFemWslU1farAMLdF2l7TazRd92gQv0m2wSsyelv9AIIl5lW/89YdwjsAA40J0bv4RJZ9H+7Em3wwtPI4Gx0836EIRw==',key_name='tempest-TestNetworkBasicOps-746965999',keypairs=<?>,launch_index=0,launched_at=2026-02-02T10:11:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='efbfe697ca674d72b47da5adf3e42c0c',ramdisk_id='',reservation_id='r-n5k0k93c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d5e062d7-95ef-409c-9ad0-60f7cf6f44ce',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-793549693',owner_user_name='tempest-TestNetworkBasicOps-793549693-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-02T10:11:10Z,user_data=None,user_id='1b1695a2a70d4aa0aa350ba17d8f6d5e',uuid=42dc4712-7770-4ecd-abba-8c8e970f8e46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.699 226298 DEBUG nova.network.os_vif_util [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converting VIF {"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.700 226298 DEBUG nova.network.os_vif_util [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.701 226298 DEBUG os_vif [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 02 10:12:06 compute-1 podman[236166]: 2026-02-02 10:12:06.703112612 +0000 UTC m=+0.194903267 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.704 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.704 226298 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29f94a0b-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.708 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.711 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.716 226298 INFO os_vif [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:0c:ce,bridge_name='br-int',has_traffic_filtering=True,id=29f94a0b-58b9-437a-9157-c3ce95454def,network=Network(07b5f9e6-a53d-47d1-be8b-5269063b871d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29f94a0b-58')
Feb 02 10:12:06 compute-1 podman[236205]: 2026-02-02 10:12:06.745417136 +0000 UTC m=+0.058800783 container remove 4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.751 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[b1fc8833-f5ba-4c5f-9025-bc89b9e133d6]: (4, ('Mon Feb  2 10:12:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d (4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7)\n4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7\nMon Feb  2 10:12:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d (4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7)\n4f62f65d56fc12e0d09bac18f19d04c76adbe3def2882e05c580438f3f6680e7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.754 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[039239ce-de94-4503-8815-1003ad9183b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.755 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07b5f9e6-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.757 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:06 compute-1 kernel: tap07b5f9e6-a0: left promiscuous mode
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.767 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.770 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[5350d28a-f5c5-4613-8adf-ec18c748967c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.788 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[bd53ca92-338e-4650-ac7e-5a80c05a9eb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.790 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[29894cd1-9c6e-48f1-9839-a5d9a663313f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.806 229827 DEBUG oslo.privsep.daemon [-] privsep: reply[db61f0a2-6f45-4e82-85d5-faa37f20be3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423203, 'reachable_time': 32173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236244, 'error': None, 'target': 'ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.809 143813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-07b5f9e6-a53d-47d1-be8b-5269063b871d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 02 10:12:06 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:06.809 143813 DEBUG oslo.privsep.daemon [-] privsep: reply[3ffa7964-ddc5-4f95-8552-553e1e956488]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 02 10:12:06 compute-1 systemd[1]: run-netns-ovnmeta\x2d07b5f9e6\x2da53d\x2d47d1\x2dbe8b\x2d5269063b871d.mount: Deactivated successfully.
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.873 226298 DEBUG nova.compute.manager [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.873 226298 DEBUG nova.compute.manager [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing instance network info cache due to event network-changed-29f94a0b-58b9-437a-9157-c3ce95454def. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.874 226298 DEBUG oslo_concurrency.lockutils [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.874 226298 DEBUG oslo_concurrency.lockutils [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquired lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 02 10:12:06 compute-1 nova_compute[226294]: 2026-02-02 10:12:06.874 226298 DEBUG nova.network.neutron [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Refreshing network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 02 10:12:07 compute-1 nova_compute[226294]: 2026-02-02 10:12:07.176 226298 INFO nova.virt.libvirt.driver [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Deleting instance files /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46_del
Feb 02 10:12:07 compute-1 nova_compute[226294]: 2026-02-02 10:12:07.177 226298 INFO nova.virt.libvirt.driver [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Deletion of /var/lib/nova/instances/42dc4712-7770-4ecd-abba-8c8e970f8e46_del complete
Feb 02 10:12:07 compute-1 nova_compute[226294]: 2026-02-02 10:12:07.229 226298 INFO nova.compute.manager [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Took 0.99 seconds to destroy the instance on the hypervisor.
Feb 02 10:12:07 compute-1 nova_compute[226294]: 2026-02-02 10:12:07.230 226298 DEBUG oslo.service.loopingcall [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 02 10:12:07 compute-1 nova_compute[226294]: 2026-02-02 10:12:07.230 226298 DEBUG nova.compute.manager [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 02 10:12:07 compute-1 nova_compute[226294]: 2026-02-02 10:12:07.231 226298 DEBUG nova.network.neutron [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 02 10:12:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:07.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:07.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:08 compute-1 nova_compute[226294]: 2026-02-02 10:12:08.200 226298 DEBUG nova.compute.manager [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:12:08 compute-1 nova_compute[226294]: 2026-02-02 10:12:08.200 226298 DEBUG oslo_concurrency.lockutils [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:08 compute-1 nova_compute[226294]: 2026-02-02 10:12:08.201 226298 DEBUG oslo_concurrency.lockutils [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:08 compute-1 nova_compute[226294]: 2026-02-02 10:12:08.201 226298 DEBUG oslo_concurrency.lockutils [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:08 compute-1 nova_compute[226294]: 2026-02-02 10:12:08.202 226298 DEBUG nova.compute.manager [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:12:08 compute-1 nova_compute[226294]: 2026-02-02 10:12:08.202 226298 DEBUG nova.compute.manager [req-b501d134-1f7c-4a1d-84b1-4381e36cd594 req-30cad1bf-c8ec-43a7-beeb-c8a6b3c96cd4 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-unplugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 02 10:12:08 compute-1 ceph-mon[80115]: pgmap v981: 353 pgs: 353 active+clean; 121 MiB data, 347 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 6.7 KiB/s wr, 29 op/s
Feb 02 10:12:08 compute-1 nova_compute[226294]: 2026-02-02 10:12:08.939 226298 DEBUG nova.network.neutron [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:12:08 compute-1 nova_compute[226294]: 2026-02-02 10:12:08.973 226298 INFO nova.compute.manager [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Took 1.74 seconds to deallocate network for instance.
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.090 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.090 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.150 226298 DEBUG oslo_concurrency.processutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.192 226298 DEBUG nova.network.neutron [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updated VIF entry in instance network info cache for port 29f94a0b-58b9-437a-9157-c3ce95454def. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.193 226298 DEBUG nova.network.neutron [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Updating instance_info_cache with network_info: [{"id": "29f94a0b-58b9-437a-9157-c3ce95454def", "address": "fa:16:3e:5f:0c:ce", "network": {"id": "07b5f9e6-a53d-47d1-be8b-5269063b871d", "bridge": "br-int", "label": "tempest-network-smoke--1567229594", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "efbfe697ca674d72b47da5adf3e42c0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29f94a0b-58", "ovs_interfaceid": "29f94a0b-58b9-437a-9157-c3ce95454def", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.214 226298 DEBUG oslo_concurrency.lockutils [req-83ce3861-b05f-43d2-ab9b-3956c691206d req-35e130cc-bbc7-4ce3-99fb-6cb43fb905d6 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Releasing lock "refresh_cache-42dc4712-7770-4ecd-abba-8c8e970f8e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 02 10:12:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:09.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:09.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:09 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:12:09 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3375339960' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.600 226298 DEBUG oslo_concurrency.processutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.607 226298 DEBUG nova.compute.provider_tree [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.626 226298 DEBUG nova.scheduler.client.report [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:12:09 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3375339960' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.657 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.703 226298 INFO nova.scheduler.client.report [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Deleted allocations for instance 42dc4712-7770-4ecd-abba-8c8e970f8e46
Feb 02 10:12:09 compute-1 nova_compute[226294]: 2026-02-02 10:12:09.804 226298 DEBUG oslo_concurrency.lockutils [None req-81cd9255-4fb5-4965-b4e9-9af24cd196fd 1b1695a2a70d4aa0aa350ba17d8f6d5e efbfe697ca674d72b47da5adf3e42c0c - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.341 226298 DEBUG nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.341 226298 DEBUG oslo_concurrency.lockutils [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Acquiring lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 DEBUG oslo_concurrency.lockutils [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 DEBUG oslo_concurrency.lockutils [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] Lock "42dc4712-7770-4ecd-abba-8c8e970f8e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 DEBUG nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] No waiting events found dispatching network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 WARNING nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received unexpected event network-vif-plugged-29f94a0b-58b9-437a-9157-c3ce95454def for instance with vm_state deleted and task_state None.
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.342 226298 DEBUG nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Received event network-vif-deleted-29f94a0b-58b9-437a-9157-c3ce95454def external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.343 226298 INFO nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Neutron deleted interface 29f94a0b-58b9-437a-9157-c3ce95454def; detaching it from the instance and deleting it from the info cache
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.343 226298 DEBUG nova.network.neutron [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Feb 02 10:12:10 compute-1 nova_compute[226294]: 2026-02-02 10:12:10.346 226298 DEBUG nova.compute.manager [req-393858a0-c480-4efa-86a9-a099ed9fc60c req-4c1146ab-9cb5-4a23-a2df-840e1142d921 b497715c83c54dd784cfd8facd16e324 8bec08e43900467887b10711a12caf82 - - default default] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Detach interface failed, port_id=29f94a0b-58b9-437a-9157-c3ce95454def, reason: Instance 42dc4712-7770-4ecd-abba-8c8e970f8e46 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Feb 02 10:12:11 compute-1 ceph-mon[80115]: pgmap v982: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 45 KiB/s rd, 7.9 KiB/s wr, 57 op/s
Feb 02 10:12:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4219362887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2263364014' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:11 compute-1 nova_compute[226294]: 2026-02-02 10:12:11.400 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:11.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:11.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:11 compute-1 nova_compute[226294]: 2026-02-02 10:12:11.707 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:12 compute-1 ceph-mon[80115]: pgmap v983: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 6.8 KiB/s wr, 56 op/s
Feb 02 10:12:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:13.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:13.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:13 compute-1 nova_compute[226294]: 2026-02-02 10:12:13.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:14 compute-1 ceph-mon[80115]: pgmap v984: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 6.8 KiB/s wr, 56 op/s
Feb 02 10:12:14 compute-1 podman[236272]: 2026-02-02 10:12:14.391241507 +0000 UTC m=+0.066274311 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 02 10:12:14 compute-1 nova_compute[226294]: 2026-02-02 10:12:14.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:14 compute-1 nova_compute[226294]: 2026-02-02 10:12:14.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:12:14 compute-1 nova_compute[226294]: 2026-02-02 10:12:14.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:12:14 compute-1 nova_compute[226294]: 2026-02-02 10:12:14.662 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:12:14 compute-1 nova_compute[226294]: 2026-02-02 10:12:14.662 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:14 compute-1 nova_compute[226294]: 2026-02-02 10:12:14.662 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:14 compute-1 nova_compute[226294]: 2026-02-02 10:12:14.662 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:12:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:15.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:12:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2445098842' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:15.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:15 compute-1 nova_compute[226294]: 2026-02-02 10:12:15.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:16 compute-1 nova_compute[226294]: 2026-02-02 10:12:16.403 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:16 compute-1 nova_compute[226294]: 2026-02-02 10:12:16.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:16 compute-1 ceph-mon[80115]: pgmap v985: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 6.8 KiB/s wr, 56 op/s
Feb 02 10:12:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1968812184' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:16 compute-1 nova_compute[226294]: 2026-02-02 10:12:16.709 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:16 compute-1 nova_compute[226294]: 2026-02-02 10:12:16.710 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:16 compute-1 nova_compute[226294]: 2026-02-02 10:12:16.755 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:17.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:17.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:17 compute-1 nova_compute[226294]: 2026-02-02 10:12:17.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:17 compute-1 nova_compute[226294]: 2026-02-02 10:12:17.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:12:17 compute-1 nova_compute[226294]: 2026-02-02 10:12:17.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:17 compute-1 nova_compute[226294]: 2026-02-02 10:12:17.717 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:17 compute-1 nova_compute[226294]: 2026-02-02 10:12:17.718 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:17 compute-1 nova_compute[226294]: 2026-02-02 10:12:17.718 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:17 compute-1 nova_compute[226294]: 2026-02-02 10:12:17.718 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:12:17 compute-1 nova_compute[226294]: 2026-02-02 10:12:17.719 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:12:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:12:18 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:12:18 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1372985538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.213 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.388 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.389 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4911MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.390 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.390 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.471 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.472 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.507 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:12:18 compute-1 ceph-mon[80115]: pgmap v986: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Feb 02 10:12:18 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1372985538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:18 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:12:18 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4098705518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.971 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.977 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:12:18 compute-1 nova_compute[226294]: 2026-02-02 10:12:18.992 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:12:19 compute-1 nova_compute[226294]: 2026-02-02 10:12:19.021 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:12:19 compute-1 nova_compute[226294]: 2026-02-02 10:12:19.021 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:19.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:19.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:19 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/4098705518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:20 compute-1 sudo[236342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:12:20 compute-1 sudo[236342]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:12:20 compute-1 sudo[236342]: pam_unix(sudo:session): session closed for user root
Feb 02 10:12:20 compute-1 ceph-mon[80115]: pgmap v987: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Feb 02 10:12:21 compute-1 nova_compute[226294]: 2026-02-02 10:12:21.023 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:12:21 compute-1 nova_compute[226294]: 2026-02-02 10:12:21.404 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:21 compute-1 nova_compute[226294]: 2026-02-02 10:12:21.465 226298 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1770027126.4637406, 42dc4712-7770-4ecd-abba-8c8e970f8e46 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 02 10:12:21 compute-1 nova_compute[226294]: 2026-02-02 10:12:21.465 226298 INFO nova.compute.manager [-] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] VM Stopped (Lifecycle Event)
Feb 02 10:12:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:21.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:21 compute-1 nova_compute[226294]: 2026-02-02 10:12:21.498 226298 DEBUG nova.compute.manager [None req-8bcc4e35-da37-46aa-9f06-793d17be0da9 - - - - - -] [instance: 42dc4712-7770-4ecd-abba-8c8e970f8e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 02 10:12:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:21.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:21 compute-1 nova_compute[226294]: 2026-02-02 10:12:21.711 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:21 compute-1 ceph-mon[80115]: pgmap v988: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:12:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:23.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:12:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:23.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:12:24 compute-1 ceph-mon[80115]: pgmap v989: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:12:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:25.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:25.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:26 compute-1 nova_compute[226294]: 2026-02-02 10:12:26.449 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:26 compute-1 ceph-mon[80115]: pgmap v990: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:12:26 compute-1 nova_compute[226294]: 2026-02-02 10:12:26.713 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:27.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:27.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:28 compute-1 ceph-mon[80115]: pgmap v991: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:12:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:29.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:29.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:30 compute-1 ceph-mon[80115]: pgmap v992: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:12:31 compute-1 nova_compute[226294]: 2026-02-02 10:12:31.452 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:31.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:31.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:31 compute-1 nova_compute[226294]: 2026-02-02 10:12:31.715 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:32 compute-1 ceph-mon[80115]: pgmap v993: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:12:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:12:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:33.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:33.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:33 compute-1 ceph-mon[80115]: pgmap v994: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:12:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:35.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:35.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:35 compute-1 ceph-mon[80115]: pgmap v995: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:12:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3176901229' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:12:36 compute-1 nova_compute[226294]: 2026-02-02 10:12:36.454 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:36 compute-1 nova_compute[226294]: 2026-02-02 10:12:36.717 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.886176) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027156886237, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1403, "num_deletes": 501, "total_data_size": 2530714, "memory_usage": 2581072, "flush_reason": "Manual Compaction"}
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027156900373, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1165947, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29852, "largest_seqno": 31250, "table_properties": {"data_size": 1161140, "index_size": 1755, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15357, "raw_average_key_size": 19, "raw_value_size": 1148938, "raw_average_value_size": 1467, "num_data_blocks": 77, "num_entries": 783, "num_filter_entries": 783, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027068, "oldest_key_time": 1770027068, "file_creation_time": 1770027156, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 14229 microseconds, and 2708 cpu microseconds.
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.900415) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1165947 bytes OK
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.900432) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.904572) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.904592) EVENT_LOG_v1 {"time_micros": 1770027156904587, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.904609) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 2523116, prev total WAL file size 2523116, number of live WAL files 2.
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.905374) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1138KB)], [57(15MB)]
Feb 02 10:12:36 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027156905450, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 17278607, "oldest_snapshot_seqno": -1}
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5786 keys, 11645443 bytes, temperature: kUnknown
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027157034421, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 11645443, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11609426, "index_size": 20419, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14533, "raw_key_size": 149661, "raw_average_key_size": 25, "raw_value_size": 11507706, "raw_average_value_size": 1988, "num_data_blocks": 818, "num_entries": 5786, "num_filter_entries": 5786, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027156, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.034622) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 11645443 bytes
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.049783) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.9 rd, 90.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 15.4 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(24.8) write-amplify(10.0) OK, records in: 6772, records dropped: 986 output_compression: NoCompression
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.049824) EVENT_LOG_v1 {"time_micros": 1770027157049809, "job": 34, "event": "compaction_finished", "compaction_time_micros": 129021, "compaction_time_cpu_micros": 31575, "output_level": 6, "num_output_files": 1, "total_output_size": 11645443, "num_input_records": 6772, "num_output_records": 5786, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027157050136, "job": 34, "event": "table_file_deletion", "file_number": 59}
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027157051958, "job": 34, "event": "table_file_deletion", "file_number": 57}
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:36.905215) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052040) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052043) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:12:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:12:37.052048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:12:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:37 compute-1 podman[236376]: 2026-02-02 10:12:37.423191036 +0000 UTC m=+0.094178802 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 02 10:12:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:37.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:37.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:37 compute-1 ceph-mon[80115]: pgmap v996: 353 pgs: 353 active+clean; 41 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:12:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:39.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:39.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:39 compute-1 ceph-mon[80115]: pgmap v997: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:12:40 compute-1 sudo[236404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:12:40 compute-1 sudo[236404]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:12:40 compute-1 sudo[236404]: pam_unix(sudo:session): session closed for user root
Feb 02 10:12:41 compute-1 nova_compute[226294]: 2026-02-02 10:12:41.498 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:41.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:41.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:41 compute-1 nova_compute[226294]: 2026-02-02 10:12:41.719 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:41 compute-1 ceph-mon[80115]: pgmap v998: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:12:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1433165583' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:12:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3208385706' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Feb 02 10:12:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:43.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:43.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:43 compute-1 ceph-mon[80115]: pgmap v999: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Feb 02 10:12:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:44.911 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:12:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:44.911 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:12:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:12:44.911 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:12:45 compute-1 podman[236432]: 2026-02-02 10:12:45.363987662 +0000 UTC m=+0.043691142 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 02 10:12:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:45.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:45.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:45 compute-1 ceph-mon[80115]: pgmap v1000: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Feb 02 10:12:46 compute-1 nova_compute[226294]: 2026-02-02 10:12:46.501 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:46 compute-1 nova_compute[226294]: 2026-02-02 10:12:46.721 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:47.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:47.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:47 compute-1 ceph-mon[80115]: pgmap v1001: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 32 op/s
Feb 02 10:12:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:12:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:49.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:49.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:49 compute-1 ceph-mon[80115]: pgmap v1002: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Feb 02 10:12:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:12:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:12:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:12:51 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:51.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:12:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:51.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:12:51 compute-1 nova_compute[226294]: 2026-02-02 10:12:51.563 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:51 compute-1 nova_compute[226294]: 2026-02-02 10:12:51.723 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:52 compute-1 ceph-mon[80115]: pgmap v1003: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Feb 02 10:12:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:52 compute-1 ovn_controller[133666]: 2026-02-02T10:12:52Z|00068|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb 02 10:12:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:12:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:53.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:53 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:53.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:54 compute-1 ceph-mon[80115]: pgmap v1004: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Feb 02 10:12:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/232670626' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:12:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/232670626' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:12:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:55.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:12:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:12:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:55.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:12:56 compute-1 ceph-mon[80115]: pgmap v1005: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Feb 02 10:12:56 compute-1 nova_compute[226294]: 2026-02-02 10:12:56.587 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:56 compute-1 nova_compute[226294]: 2026-02-02 10:12:56.725 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:12:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:12:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:12:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:57.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:12:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:12:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:12:57 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:57.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:12:58 compute-1 ceph-mon[80115]: pgmap v1006: 353 pgs: 353 active+clean; 88 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 69 op/s
Feb 02 10:12:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:12:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:12:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:12:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:12:59.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:12:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:12:59 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:12:59.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:00 compute-1 ceph-mon[80115]: pgmap v1007: 353 pgs: 353 active+clean; 121 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 135 op/s
Feb 02 10:13:00 compute-1 sudo[236458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:13:00 compute-1 sudo[236458]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:13:00 compute-1 sudo[236458]: pam_unix(sudo:session): session closed for user root
Feb 02 10:13:00 compute-1 sudo[236484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:13:00 compute-1 sudo[236484]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:13:00 compute-1 sudo[236509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:13:00 compute-1 sudo[236509]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:13:00 compute-1 sudo[236509]: pam_unix(sudo:session): session closed for user root
Feb 02 10:13:01 compute-1 sudo[236484]: pam_unix(sudo:session): session closed for user root
Feb 02 10:13:01 compute-1 sudo[236566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:13:01 compute-1 sudo[236566]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:13:01 compute-1 sudo[236566]: pam_unix(sudo:session): session closed for user root
Feb 02 10:13:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:01.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:01.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:01 compute-1 nova_compute[226294]: 2026-02-02 10:13:01.589 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:01 compute-1 sudo[236591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Feb 02 10:13:01 compute-1 sudo[236591]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:13:01 compute-1 nova_compute[226294]: 2026-02-02 10:13:01.726 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:01 compute-1 sudo[236591]: pam_unix(sudo:session): session closed for user root
Feb 02 10:13:02 compute-1 ceph-mon[80115]: pgmap v1008: 353 pgs: 353 active+clean; 121 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 390 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:13:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:13:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:13:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:03.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:03.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:04 compute-1 ceph-mon[80115]: pgmap v1009: 353 pgs: 353 active+clean; 121 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 390 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:13:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:05.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:05.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:05 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:13:05.808 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:13:05 compute-1 nova_compute[226294]: 2026-02-02 10:13:05.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:05 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:13:05.809 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:13:06 compute-1 ceph-mon[80115]: pgmap v1010: 353 pgs: 353 active+clean; 121 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Feb 02 10:13:06 compute-1 nova_compute[226294]: 2026-02-02 10:13:06.592 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:06 compute-1 nova_compute[226294]: 2026-02-02 10:13:06.728 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:06 compute-1 sudo[236638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:13:06 compute-1 sudo[236638]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:13:06 compute-1 sudo[236638]: pam_unix(sudo:session): session closed for user root
Feb 02 10:13:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:07.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:13:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:07.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:13:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:07 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:13:07 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3058317273' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:13:08 compute-1 podman[236664]: 2026-02-02 10:13:08.443686691 +0000 UTC m=+0.117851341 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 02 10:13:08 compute-1 ceph-mon[80115]: pgmap v1011: 353 pgs: 353 active+clean; 121 MiB data, 349 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Feb 02 10:13:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:13:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:09.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:13:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:09.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:09 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 02 10:13:10 compute-1 ceph-mon[80115]: pgmap v1012: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 410 KiB/s rd, 2.1 MiB/s wr, 94 op/s
Feb 02 10:13:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:11.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:11.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:11 compute-1 nova_compute[226294]: 2026-02-02 10:13:11.595 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:11 compute-1 nova_compute[226294]: 2026-02-02 10:13:11.729 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4170229159' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:13:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:12 compute-1 ceph-mon[80115]: pgmap v1013: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 28 op/s
Feb 02 10:13:12 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4161531560' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:13:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:13:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:13.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:13:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:13.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:13 compute-1 ceph-mon[80115]: pgmap v1014: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 28 op/s
Feb 02 10:13:14 compute-1 nova_compute[226294]: 2026-02-02 10:13:14.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:14 compute-1 nova_compute[226294]: 2026-02-02 10:13:14.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:13:14 compute-1 nova_compute[226294]: 2026-02-02 10:13:14.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:13:14 compute-1 nova_compute[226294]: 2026-02-02 10:13:14.669 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:13:14 compute-1 nova_compute[226294]: 2026-02-02 10:13:14.669 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:14 compute-1 nova_compute[226294]: 2026-02-02 10:13:14.670 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2480915646' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:13:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:15.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:15.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:15 compute-1 nova_compute[226294]: 2026-02-02 10:13:15.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:15 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:13:15.811 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:13:15 compute-1 ceph-mon[80115]: pgmap v1015: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 29 op/s
Feb 02 10:13:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4002384306' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:13:16 compute-1 podman[236695]: 2026-02-02 10:13:16.363446976 +0000 UTC m=+0.039765167 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:13:16 compute-1 nova_compute[226294]: 2026-02-02 10:13:16.596 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:16 compute-1 nova_compute[226294]: 2026-02-02 10:13:16.643 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:16 compute-1 nova_compute[226294]: 2026-02-02 10:13:16.647 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:16 compute-1 nova_compute[226294]: 2026-02-02 10:13:16.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:16 compute-1 nova_compute[226294]: 2026-02-02 10:13:16.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 02 10:13:16 compute-1 nova_compute[226294]: 2026-02-02 10:13:16.731 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:17.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:13:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:17.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:13:17 compute-1 nova_compute[226294]: 2026-02-02 10:13:17.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:17 compute-1 nova_compute[226294]: 2026-02-02 10:13:17.668 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:13:17 compute-1 ceph-mon[80115]: pgmap v1016: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Feb 02 10:13:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:13:18 compute-1 nova_compute[226294]: 2026-02-02 10:13:18.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:18 compute-1 nova_compute[226294]: 2026-02-02 10:13:18.673 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:13:18 compute-1 nova_compute[226294]: 2026-02-02 10:13:18.674 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:13:18 compute-1 nova_compute[226294]: 2026-02-02 10:13:18.674 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:13:18 compute-1 nova_compute[226294]: 2026-02-02 10:13:18.674 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:13:18 compute-1 nova_compute[226294]: 2026-02-02 10:13:18.675 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:13:19 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:13:19 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2299787801' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:13:19 compute-1 nova_compute[226294]: 2026-02-02 10:13:19.143 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:13:19 compute-1 nova_compute[226294]: 2026-02-02 10:13:19.285 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:13:19 compute-1 nova_compute[226294]: 2026-02-02 10:13:19.286 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4943MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:13:19 compute-1 nova_compute[226294]: 2026-02-02 10:13:19.286 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:13:19 compute-1 nova_compute[226294]: 2026-02-02 10:13:19.286 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:13:19 compute-1 nova_compute[226294]: 2026-02-02 10:13:19.584 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:13:19 compute-1 nova_compute[226294]: 2026-02-02 10:13:19.584 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:13:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:19.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:19.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:19 compute-1 nova_compute[226294]: 2026-02-02 10:13:19.651 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:13:19 compute-1 ceph-mon[80115]: pgmap v1017: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Feb 02 10:13:19 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2299787801' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:13:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:13:20 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2021419824' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:13:20 compute-1 nova_compute[226294]: 2026-02-02 10:13:20.093 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:13:20 compute-1 nova_compute[226294]: 2026-02-02 10:13:20.100 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:13:20 compute-1 nova_compute[226294]: 2026-02-02 10:13:20.116 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:13:20 compute-1 nova_compute[226294]: 2026-02-02 10:13:20.119 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:13:20 compute-1 nova_compute[226294]: 2026-02-02 10:13:20.119 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:13:20 compute-1 nova_compute[226294]: 2026-02-02 10:13:20.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:20 compute-1 nova_compute[226294]: 2026-02-02 10:13:20.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2021419824' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:13:21 compute-1 sudo[236763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:13:21 compute-1 sudo[236763]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:13:21 compute-1 sudo[236763]: pam_unix(sudo:session): session closed for user root
Feb 02 10:13:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:21.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:21.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:21 compute-1 nova_compute[226294]: 2026-02-02 10:13:21.599 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:21 compute-1 nova_compute[226294]: 2026-02-02 10:13:21.733 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:21 compute-1 ceph-mon[80115]: pgmap v1018: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:23.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:23.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:23 compute-1 ceph-mon[80115]: pgmap v1019: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:13:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:25.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:13:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:25.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:25 compute-1 ceph-mon[80115]: pgmap v1020: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:13:26 compute-1 nova_compute[226294]: 2026-02-02 10:13:26.638 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:26 compute-1 nova_compute[226294]: 2026-02-02 10:13:26.735 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:27.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:27.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:27 compute-1 ceph-mon[80115]: pgmap v1021: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:28 compute-1 nova_compute[226294]: 2026-02-02 10:13:28.661 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:13:28 compute-1 nova_compute[226294]: 2026-02-02 10:13:28.662 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 02 10:13:28 compute-1 nova_compute[226294]: 2026-02-02 10:13:28.685 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 02 10:13:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:29.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:29.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:29 compute-1 ceph-mon[80115]: pgmap v1022: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:13:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:31.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:31.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:31 compute-1 nova_compute[226294]: 2026-02-02 10:13:31.640 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:31 compute-1 nova_compute[226294]: 2026-02-02 10:13:31.737 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:31 compute-1 ceph-mon[80115]: pgmap v1023: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:13:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:13:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:33.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:13:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:33.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:34 compute-1 ceph-mon[80115]: pgmap v1024: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:13:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:35.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:13:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:13:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:35 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:35.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:36 compute-1 ceph-mon[80115]: pgmap v1025: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:13:36 compute-1 nova_compute[226294]: 2026-02-02 10:13:36.669 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:36 compute-1 nova_compute[226294]: 2026-02-02 10:13:36.738 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:13:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:37 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:37.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:37.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:38 compute-1 ceph-mon[80115]: pgmap v1026: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:39 compute-1 podman[236797]: 2026-02-02 10:13:39.384803414 +0000 UTC m=+0.064765761 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller)
Feb 02 10:13:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:13:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:39 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:39.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:39.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:40 compute-1 ceph-mon[80115]: pgmap v1027: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:13:41 compute-1 sudo[236826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:13:41 compute-1 sudo[236826]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:13:41 compute-1 sudo[236826]: pam_unix(sudo:session): session closed for user root
Feb 02 10:13:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:13:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:41 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:41.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:41.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:41 compute-1 nova_compute[226294]: 2026-02-02 10:13:41.672 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:41 compute-1 nova_compute[226294]: 2026-02-02 10:13:41.740 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:42 compute-1 ceph-mon[80115]: pgmap v1028: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:13:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:43.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:13:43 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:43.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:13:44 compute-1 ceph-mon[80115]: pgmap v1029: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:13:44.912 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:13:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:13:44.912 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:13:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:13:44.912 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:13:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a34cb5d0 =====
Feb 02 10:13:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:45.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a34cb5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:45 compute-1 radosgw[81528]: beast: 0x7fc4a34cb5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:45.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:46 compute-1 ceph-mon[80115]: pgmap v1030: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:13:46 compute-1 nova_compute[226294]: 2026-02-02 10:13:46.674 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:46 compute-1 nova_compute[226294]: 2026-02-02 10:13:46.742 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:47 compute-1 podman[236854]: 2026-02-02 10:13:47.384042342 +0000 UTC m=+0.066447466 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible)
Feb 02 10:13:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:13:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:47.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:13:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:47.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:48 compute-1 ceph-mon[80115]: pgmap v1031: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:13:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:13:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:49.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:13:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:13:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:49.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:13:50 compute-1 ceph-mon[80115]: pgmap v1032: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:13:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:51.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:51.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:51 compute-1 nova_compute[226294]: 2026-02-02 10:13:51.676 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:51 compute-1 nova_compute[226294]: 2026-02-02 10:13:51.744 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:52 compute-1 ceph-mon[80115]: pgmap v1033: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:53.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:53.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:54 compute-1 ceph-mon[80115]: pgmap v1034: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2490674294' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:13:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2490674294' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:13:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:55.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:55.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:56 compute-1 ceph-mon[80115]: pgmap v1035: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:13:56 compute-1 nova_compute[226294]: 2026-02-02 10:13:56.710 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:56 compute-1 nova_compute[226294]: 2026-02-02 10:13:56.746 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:13:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:13:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:57.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:13:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:57.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:13:58 compute-1 ceph-mon[80115]: pgmap v1036: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:13:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:13:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:13:59.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:13:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:13:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:13:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:13:59.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:00 compute-1 ceph-mon[80115]: pgmap v1037: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:01 compute-1 sudo[236880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:14:01 compute-1 sudo[236880]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:14:01 compute-1 sudo[236880]: pam_unix(sudo:session): session closed for user root
Feb 02 10:14:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:01.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:01.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:01 compute-1 nova_compute[226294]: 2026-02-02 10:14:01.716 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:01 compute-1 nova_compute[226294]: 2026-02-02 10:14:01.747 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:02 compute-1 ceph-mon[80115]: pgmap v1038: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:14:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:03.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:03.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:04 compute-1 ceph-mon[80115]: pgmap v1039: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:05.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:05.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:06 compute-1 ceph-mon[80115]: pgmap v1040: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:06 compute-1 nova_compute[226294]: 2026-02-02 10:14:06.749 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:14:06 compute-1 nova_compute[226294]: 2026-02-02 10:14:06.751 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:14:06 compute-1 nova_compute[226294]: 2026-02-02 10:14:06.751 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:14:06 compute-1 nova_compute[226294]: 2026-02-02 10:14:06.752 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:14:06 compute-1 nova_compute[226294]: 2026-02-02 10:14:06.758 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:06 compute-1 nova_compute[226294]: 2026-02-02 10:14:06.759 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:14:07 compute-1 sudo[236909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:14:07 compute-1 sudo[236909]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:14:07 compute-1 sudo[236909]: pam_unix(sudo:session): session closed for user root
Feb 02 10:14:07 compute-1 sudo[236934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Feb 02 10:14:07 compute-1 sudo[236934]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:14:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:07 compute-1 sshd-session[237001]: Accepted publickey for zuul from 192.168.122.10 port 37774 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 10:14:07 compute-1 systemd-logind[805]: New session 55 of user zuul.
Feb 02 10:14:07 compute-1 systemd[1]: Started Session 55 of User zuul.
Feb 02 10:14:07 compute-1 sshd-session[237001]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 10:14:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:07.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:07.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:07 compute-1 sudo[237043]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 02 10:14:07 compute-1 sudo[237043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 10:14:07 compute-1 podman[237036]: 2026-02-02 10:14:07.744331067 +0000 UTC m=+0.074058757 container exec 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Feb 02 10:14:07 compute-1 podman[237036]: 2026-02-02 10:14:07.840073913 +0000 UTC m=+0.169801583 container exec_died 01cf0f34952fee77036b3444f3bcffc4063a933b24861a159799fb77cf2e6e0e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-crash-compute-1, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True)
Feb 02 10:14:08 compute-1 ceph-mon[80115]: pgmap v1041: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:08 compute-1 podman[237254]: 2026-02-02 10:14:08.966587122 +0000 UTC m=+0.048392973 container exec 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 10:14:09 compute-1 podman[237254]: 2026-02-02 10:14:09.00251041 +0000 UTC m=+0.084316261 container exec_died 8d747add3fe1f1bfe0d198ee6025faae61dcb9cc9c5a16a41a7205162dac9d07 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 02 10:14:09 compute-1 podman[237400]: 2026-02-02 10:14:09.381908007 +0000 UTC m=+0.060966328 container exec 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 10:14:09 compute-1 podman[237400]: 2026-02-02 10:14:09.59147437 +0000 UTC m=+0.270532711 container exec_died 956874b4175e8a2fade475e5bb43454e3b46ed50db9c41f7163b928c5dfb05c2 (image=quay.io/ceph/haproxy:2.3, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-haproxy-nfs-cephfs-compute-1-sryqbx)
Feb 02 10:14:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:09.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:09.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:09 compute-1 podman[237451]: 2026-02-02 10:14:09.740102377 +0000 UTC m=+0.103036251 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 02 10:14:09 compute-1 podman[237533]: 2026-02-02 10:14:09.842212652 +0000 UTC m=+0.061499982 container exec 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, release=1793, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, name=keepalived, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20)
Feb 02 10:14:09 compute-1 podman[237533]: 2026-02-02 10:14:09.85150335 +0000 UTC m=+0.070790730 container exec_died 2d34301ed9240709c7b956110bf2f7a09db1491812be3ea91463444d19b431a2 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-d241d473-9fcb-5f74-b163-f1ca4454e7f1-keepalived-nfs-cephfs-compute-1-whrwoq, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Feb 02 10:14:09 compute-1 sudo[236934]: pam_unix(sudo:session): session closed for user root
Feb 02 10:14:09 compute-1 sudo[237584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:14:09 compute-1 sudo[237584]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:14:09 compute-1 sudo[237584]: pam_unix(sudo:session): session closed for user root
Feb 02 10:14:10 compute-1 sudo[237609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:14:10 compute-1 sudo[237609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:14:10 compute-1 sudo[237609]: pam_unix(sudo:session): session closed for user root
Feb 02 10:14:10 compute-1 ceph-mon[80115]: pgmap v1042: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:14:10 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:14:11 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Feb 02 10:14:11 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3049693552' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:14:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:11.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:11.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:11 compute-1 nova_compute[226294]: 2026-02-02 10:14:11.758 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:11 compute-1 ceph-mon[80115]: from='client.26575 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:11 compute-1 ceph-mon[80115]: from='client.16905 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:11 compute-1 ceph-mon[80115]: from='client.26522 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:11 compute-1 ceph-mon[80115]: pgmap v1043: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 525 B/s rd, 0 op/s
Feb 02 10:14:11 compute-1 ceph-mon[80115]: from='client.26587 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:11 compute-1 ceph-mon[80115]: from='client.16914 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:11 compute-1 ceph-mon[80115]: from='client.26531 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:11 compute-1 ceph-mon[80115]: Health check failed: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Feb 02 10:14:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3461409384' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:14:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1672189195' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:14:11 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3049693552' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:14:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:12 compute-1 ceph-mon[80115]: pgmap v1044: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 525 B/s rd, 0 op/s
Feb 02 10:14:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:13.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:14:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:13.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:14:13 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2728669691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:14:13 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/612839699' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:14:14 compute-1 nova_compute[226294]: 2026-02-02 10:14:14.673 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:14 compute-1 ceph-mon[80115]: pgmap v1045: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 788 B/s rd, 0 op/s
Feb 02 10:14:15 compute-1 nova_compute[226294]: 2026-02-02 10:14:15.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:15.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:15.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/344012355' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:14:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:14:15 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:14:16 compute-1 sudo[237840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:14:16 compute-1 sudo[237840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:14:16 compute-1 sudo[237840]: pam_unix(sudo:session): session closed for user root
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.670 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.670 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.760 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.762 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.762 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.762 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.799 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:16 compute-1 nova_compute[226294]: 2026-02-02 10:14:16.801 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:14:17 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2791913223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:14:17 compute-1 ceph-mon[80115]: pgmap v1046: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 525 B/s rd, 0 op/s
Feb 02 10:14:17 compute-1 ovs-vsctl[237895]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 02 10:14:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:17 compute-1 nova_compute[226294]: 2026-02-02 10:14:17.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:17 compute-1 nova_compute[226294]: 2026-02-02 10:14:17.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:17.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:17.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:17 compute-1 nova_compute[226294]: 2026-02-02 10:14:17.687 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:17 compute-1 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 02 10:14:17 compute-1 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 02 10:14:17 compute-1 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 02 10:14:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:14:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:14:18 compute-1 podman[238108]: 2026-02-02 10:14:18.380792829 +0000 UTC m=+0.078521576 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 02 10:14:18 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: cache status {prefix=cache status} (starting...)
Feb 02 10:14:18 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:18 compute-1 lvm[238236]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 02 10:14:18 compute-1 lvm[238236]: VG ceph_vg0 finished
Feb 02 10:14:18 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: client ls {prefix=client ls} (starting...)
Feb 02 10:14:18 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: damage ls {prefix=damage ls} (starting...)
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump loads {prefix=dump loads} (starting...)
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:19 compute-1 ceph-mon[80115]: pgmap v1047: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 788 B/s rd, 0 op/s
Feb 02 10:14:19 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Feb 02 10:14:19 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2812603984' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:19 compute-1 nova_compute[226294]: 2026-02-02 10:14:19.647 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:19 compute-1 nova_compute[226294]: 2026-02-02 10:14:19.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:14:19 compute-1 nova_compute[226294]: 2026-02-02 10:14:19.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:19.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:19.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:19 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 02 10:14:19 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3436703442' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:19 compute-1 nova_compute[226294]: 2026-02-02 10:14:19.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:14:19 compute-1 nova_compute[226294]: 2026-02-02 10:14:19.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:14:19 compute-1 nova_compute[226294]: 2026-02-02 10:14:19.791 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:14:19 compute-1 nova_compute[226294]: 2026-02-02 10:14:19.791 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:14:19 compute-1 nova_compute[226294]: 2026-02-02 10:14:19.792 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Feb 02 10:14:19 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Feb 02 10:14:20 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1326465420' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: get subtrees {prefix=get subtrees} (starting...)
Feb 02 10:14:20 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:14:20 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3782917356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.218 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.26626 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.26558 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3787116785' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.26638 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2812603984' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.26579 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2161669210' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.26653 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3436703442' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.26594 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/425384938' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1326465420' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb 02 10:14:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3782917356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.342 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.343 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4716MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.344 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.344 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:14:20 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: ops {prefix=ops} (starting...)
Feb 02 10:14:20 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.473 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.474 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.487 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.504 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.504 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 10:14:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Feb 02 10:14:20 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3733392687' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.523 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.549 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 02 10:14:20 compute-1 nova_compute[226294]: 2026-02-02 10:14:20.578 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:14:20 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: session ls {prefix=session ls} (starting...)
Feb 02 10:14:20 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:14:20 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:14:20 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2046104399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:14:21 compute-1 nova_compute[226294]: 2026-02-02 10:14:21.015 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:14:21 compute-1 nova_compute[226294]: 2026-02-02 10:14:21.020 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:14:21 compute-1 nova_compute[226294]: 2026-02-02 10:14:21.043 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:14:21 compute-1 nova_compute[226294]: 2026-02-02 10:14:21.044 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:14:21 compute-1 nova_compute[226294]: 2026-02-02 10:14:21.044 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:14:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 02 10:14:21 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3260189932' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: status {prefix=status} (starting...)
Feb 02 10:14:21 compute-1 sudo[238695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:14:21 compute-1 sudo[238695]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:14:21 compute-1 sudo[238695]: pam_unix(sudo:session): session closed for user root
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.26668 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.26621 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4160798957' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.16980 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2277810388' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3733392687' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: pgmap v1048: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 525 B/s rd, 0 op/s
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3367092476' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/812023751' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/451823690' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1271815499' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2046104399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3260189932' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 02 10:14:21 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3985708071' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:14:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Feb 02 10:14:21 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2896364104' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:14:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:21.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:21.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 02 10:14:21 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/507030427' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:14:21 compute-1 nova_compute[226294]: 2026-02-02 10:14:21.800 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:21 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 02 10:14:21 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3286499931' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:14:22 compute-1 nova_compute[226294]: 2026-02-02 10:14:22.045 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:14:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 02 10:14:22 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2273732711' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.26701 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.26663 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.17001 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.17019 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.26722 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.17028 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1753263393' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2174409702' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3985708071' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3915256540' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2896364104' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.17052 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/507030427' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2426249143' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3286499931' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1936979949' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3378491942' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/781959846' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2835134920' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2273732711' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Feb 02 10:14:22 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3889457626' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb 02 10:14:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Feb 02 10:14:22 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4267692337' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 02 10:14:23 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/300668717' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.17085 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.26800 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.26765 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: pgmap v1049: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1452229676' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.17106 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/592185051' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3889457626' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1346187352' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/202681717' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/4267692337' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1350398264' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2712376753' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/300668717' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3294780291' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb 02 10:14:23 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Feb 02 10:14:23 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3225208465' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb 02 10:14:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:23.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:23.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:23 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 02 10:14:23 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/734854259' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000176 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000229 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.1a( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.010611 2 0.001010
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb 02 10:14:24 compute-1 ceph-osd[77691]: merge_log_dups log.dups.size()=0 olog.dups.size()=41
Feb 02 10:14:24 compute-1 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=41
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001634 2 0.000246
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000021 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 120 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:50.879131+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 81944576 unmapped: 5128192 heap: 87072768 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 120 handle_osd_map epochs [120,121], i have 120, src has [1,121]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 120 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.976458 2 0.000193
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.989453 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=118/119 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=4 mbc={}] exit Started/Stray 0.993056 6 0.000648
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=84/84 les/c/f=85/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=4 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=118/82 les/c/f=119/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=120/82 les/c/f=121/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010420 4 0.000754
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=120/82 les/c/f=121/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=120/82 les/c/f=121/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.19( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=7 ec=57/38 lis/c=120/82 les/c/f=121/83/0 sis=120) [0] r=0 lpr=120 pi=[82,120)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 44'299 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.007917 3 0.000146
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 44'299 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 44'299 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000096 1 0.000039
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 lc 44'299 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=4 mbc={}] enter Started/ReplicaActive/RepRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.045353 1 0.000121
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 121 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.034063 1 0.000042
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.087581 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started 1.081170 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=120) [0]/[1] r=-1 lpr=120 pi=[84,120)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Reset 0.000104 1 0.000164
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000056 1 0.000059
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: merge_log_dups log.dups.size()=0 olog.dups.size()=24
Feb 02 10:14:24 compute-1 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=24
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002091 3 0.000044
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000027 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 122 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:51.879297+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82018304 unmapped: 6103040 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 122 handle_osd_map epochs [122,123], i have 122, src has [1,123]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 122 handle_osd_map epochs [122,123], i have 123, src has [1,123]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.391129 2 0.000227
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.393440 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=120/121 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=120/84 les/c/f=121/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=122/84 les/c/f=123/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.044078 3 0.000225
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=122/84 les/c/f=123/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=122/84 les/c/f=123/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000039 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 123 pg[9.1a( v 44'1041 (0'0,44'1041] local-lis/les=122/123 n=4 ec=57/38 lis/c=122/84 les/c/f=123/85/0 sis=122) [0] r=0 lpr=122 pi=[84,122)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:52.879493+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 6094848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 123 heartbeat osd_stat(store_statfs(0x4fcaa5000/0x0/0x4ffc00000, data 0xd2086/0x173000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:53.879740+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82051072 unmapped: 6070272 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.140940666s of 11.563747406s, submitted: 64
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b(unlocked)] enter Initial
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=0 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000118 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=0 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000037 1 0.000070
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000146 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000322 1 0.000323
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000077 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000493 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 124 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:54.879858+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883946 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 6053888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 124 handle_osd_map epochs [124,125], i have 124, src has [1,125]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.112428 2 0.000315
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.113085 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.113293 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=124) [0] r=0 lpr=124 pi=[68,124)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000136 1 0.000191
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000008 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 125 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 125 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:55.879976+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 6045696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _renew_subs
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=2 mbc={}] exit Started/Stray 1.121016 5 0.000074
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=2 mbc={}] enter Started/ReplicaActive
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=68/68 les/c/f=69/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=2 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 44'549 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.013432 4 0.000192
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 44'549 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 44'549 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000093 1 0.000045
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 lc 44'549 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=2 mbc={}] enter Started/ReplicaActive/RepRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.016881 1 0.000101
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 126 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:56.880120+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 6029312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 126 heartbeat osd_stat(store_statfs(0x4fcaa1000/0x0/0x4ffc00000, data 0xd6146/0x179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.903217 1 0.000050
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.933791 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started 2.054894 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=125) [0]/[2] r=-1 lpr=125 pi=[68,125)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Reset 0.000272 1 0.000335
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000011 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000051 1 0.000066
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: merge_log_dups log.dups.size()=0 olog.dups.size()=15
Feb 02 10:14:24 compute-1 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=15
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001882 3 0.000063
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 127 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:57.880300+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 6012928 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 127 ms_handle_reset con 0x5616e226d800 session 0x5616e275dc20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 127 ms_handle_reset con 0x5616dff2d400 session 0x5616e1ec8780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 127 handle_osd_map epochs [127,128], i have 127, src has [1,128]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.034532 2 0.000088
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.036567 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=125/126 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 128 handle_osd_map epochs [127,128], i have 128, src has [1,128]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=125/68 les/c/f=126/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=127/68 les/c/f=128/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006295 3 0.000262
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=127/68 les/c/f=128/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=127/68 les/c/f=128/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000025 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 128 pg[9.1b( v 44'1041 (0'0,44'1041] local-lis/les=127/128 n=2 ec=57/38 lis/c=127/68 les/c/f=128/69/0 sis=127) [0] r=0 lpr=127 pi=[68,127)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:58.880433+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 128 heartbeat osd_stat(store_statfs(0x4fca96000/0x0/0x4ffc00000, data 0xdc36c/0x183000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 5988352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:42:59.880623+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 900489 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 5980160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 128 handle_osd_map epochs [129,130], i have 128, src has [1,130]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:00.880790+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 5971968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:01.880979+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 5971968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 130 handle_osd_map epochs [131,132], i have 130, src has [1,132]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:02.881230+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 5906432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 132 heartbeat osd_stat(store_statfs(0x4fca8c000/0x0/0x4ffc00000, data 0xe4254/0x18f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:03.881372+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e(unlocked)] enter Initial
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=0 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000063 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=0 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000018 1 0.000034
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000121 1 0.000055
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000028 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000169 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 133 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 5898240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:04.881551+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 916045 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.393966675s of 10.596799850s, submitted: 70
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 133 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 1.021068 2 0.000058
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 1.021285 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 1.021327 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=133) [0] r=0 lpr=133 pi=[74,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000183 1 0.000267
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000038 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 134 pg[9.1e( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 5898240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:05.881747+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f(unlocked)] enter Initial
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=0 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000060 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=0 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000016 1 0.000029
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.001011 1 0.000044
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000032 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.001067 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/WaitActingChange
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 1.019780 5 0.000135
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=74/74 les/c/f=75/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 44'600 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.014964 4 0.000157
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 44'600 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 44'600 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000134 1 0.000096
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 lc 44'600 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 5840896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.047682 1 0.000062
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 135 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 135 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary/WaitActingChange 0.357885 2 0.000069
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started/Primary 0.358991 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Started 0.359019 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=135) [0] r=0 lpr=135 pi=[96,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.294867 1 0.000041
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.357817 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started 1.377689 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[74,134)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000112 1 0.000172
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000010 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Reset 0.000124 1 0.000218
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000026 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000089 1 0.000143
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=29
Feb 02 10:14:24 compute-1 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=29
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.002163 3 0.000172
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 136 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 136 heartbeat osd_stat(store_statfs(0x4fca84000/0x0/0x4ffc00000, data 0xe8314/0x195000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:06.881912+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 5824512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 136 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.969000 2 0.000098
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.971374 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=134/135 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb 02 10:14:24 compute-1 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 0.971604 5 0.000515
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 0'0 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=96/96 les/c/f=97/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 crt=44'1041 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=134/74 les/c/f=135/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/74 les/c/f=137/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.003521 4 0.000519
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/74 les/c/f=137/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/74 les/c/f=137/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000019 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1e( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/74 les/c/f=137/75/0 sis=136) [0] r=0 lpr=136 pi=[74,136)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 44'471 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.006987 4 0.000211
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 44'471 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 44'471 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000083 1 0.000067
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 lc 44'471 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.043071 1 0.000050
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 137 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:07.882054+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 5988352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.978616 1 0.000080
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 1.028919 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] exit Started 2.000586 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=136) [0]/[1] r=-1 lpr=136 pi=[96,136)/1 luod=0'0 crt=44'1041 mlcod 0'0 active+remapped mbc={}] enter Reset
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 luod=0'0 crt=44'1041 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Reset 0.000075 1 0.000117
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Start
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000343 1 0.000060
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=0/0 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: merge_log_dups log.dups.size()=0olog.dups.size()=31
Feb 02 10:14:24 compute-1 ceph-osd[77691]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=31
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001450 3 0.000062
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 138 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:08.882180+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 5931008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321d000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.065664 2 0.000105
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.067577 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=136/137 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=136/96 les/c/f=137/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=138/96 les/c/f=139/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.025509 4 0.000275
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=138/96 les/c/f=139/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=138/96 les/c/f=139/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000066 0 0.000000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 pg_epoch: 139 pg[9.1f( v 44'1041 (0'0,44'1041] local-lis/les=138/139 n=5 ec=57/38 lis/c=138/96 les/c/f=139/97/0 sis=138) [0] r=0 lpr=138 pi=[96,138)/1 crt=44'1041 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:09.882329+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949870 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 5922816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca75000/0x0/0x4ffc00000, data 0xf239c/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:10.882470+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 5922816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca75000/0x0/0x4ffc00000, data 0xf239c/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:11.882629+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 6062080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:12.882800+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 6062080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:13.882967+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 6062080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca75000/0x0/0x4ffc00000, data 0xf239c/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:14.883103+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951382 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 6045696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:15.883275+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 6045696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:16.883425+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 6045696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.920713425s of 12.206723213s, submitted: 63
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:17.883584+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 6037504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:18.883726+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: mgrc handle_mgr_map Got map version 30
Feb 02 10:14:24 compute-1 ceph-osd[77691]: mgrc handle_mgr_map Active mgr is now [v2:192.168.122.100:6800/1282799344,v1:192.168.122.100:6801/1282799344]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 5865472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:19.883922+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950130 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 5865472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:20.884057+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 5865472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:21.884232+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 5865472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:22.884390+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:23.884549+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:24.884660+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:25.884767+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:26.884905+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 5857280 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:27.885047+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 5849088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:28.885218+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 5849088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:29.885373+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 5840896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:30.885515+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 5840896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:31.885675+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 5832704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:32.885878+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 5832704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:33.886054+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 5832704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:34.886207+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 5824512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:35.886421+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 5824512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:36.886589+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 5816320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:37.886815+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 5816320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:38.886981+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82305024 unmapped: 5816320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:39.887226+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 5808128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:40.887346+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 5808128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:41.887504+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 5799936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:42.887671+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 5799936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:43.887875+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 5799936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:44.888033+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 5791744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:45.888237+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321d000 session 0x5616dfb63c20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 5791744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:46.888392+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82345984 unmapped: 5775360 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:47.888539+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 5758976 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:48.888761+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 5750784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:49.888902+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 5750784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:50.889026+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 5750784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:51.889167+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 5742592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:52.889407+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 5742592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:53.889582+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82386944 unmapped: 5734400 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:54.889727+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949998 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 5726208 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:55.889913+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 5718016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3543400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 39.616950989s of 39.631927490s, submitted: 4
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:56.890095+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 5718016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:57.890261+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 5718016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:58.890402+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 5709824 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:43:59.890527+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 949778 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 5709824 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:00.890688+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 5701632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:01.890872+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 5701632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:02.891026+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2617800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 5701632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:03.891294+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82427904 unmapped: 5693440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:04.891429+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951290 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82427904 unmapped: 5693440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:05.891658+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82436096 unmapped: 5685248 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:06.891847+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 5677056 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:07.892006+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 5668864 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:08.892258+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 5668864 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:09.892457+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950699 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 5660672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:10.892604+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 5660672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:11.892781+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 5652480 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:12.937464+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 5652480 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:13.937598+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.212181091s of 17.223489761s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 5644288 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:14.937741+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 5644288 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:15.937914+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 5636096 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:16.938071+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 5636096 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:17.938260+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 5636096 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:18.938417+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 5627904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:19.938555+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 5627904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:20.938703+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 5627904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:21.938875+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 5619712 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:22.939227+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 5619712 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:23.940271+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 5611520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:24.940627+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 5611520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:25.940884+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2617800 session 0x5616e1eb5c20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3543400 session 0x5616e359af00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 5603328 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:26.941278+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 5603328 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:27.941393+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 5595136 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:28.941719+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 5595136 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:29.941879+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 5595136 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:30.942176+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 5586944 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:31.942312+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 5586944 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:32.942530+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21acc00 session 0x5616e04dc1e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e34d4c00 session 0x5616dfb745a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 5570560 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:33.943271+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 5570560 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:34.943475+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950567 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 5562368 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:35.943911+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82567168 unmapped: 5554176 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:36.944126+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2617800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.014419556s of 23.017801285s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82567168 unmapped: 5554176 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:37.944652+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2273400 session 0x5616e34745a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3284800 session 0x5616e1ec9e00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 5545984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:38.944777+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 5545984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:39.944922+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950699 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 5545984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:40.945097+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 5537792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:41.945238+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 5537792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:42.945421+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226f400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 5537792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:43.945677+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327c000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 5529600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:44.945852+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950831 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 5529600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:45.946000+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 5521408 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:46.946232+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 5521408 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:47.946433+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 5521408 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3284000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.597830772s of 11.617882729s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:48.946586+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 5513216 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:49.946738+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950963 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 5513216 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:50.946891+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82616320 unmapped: 5505024 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:51.947041+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82616320 unmapped: 5505024 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:52.947302+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82624512 unmapped: 5496832 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:53.947439+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82632704 unmapped: 5488640 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:54.947568+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 950831 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82632704 unmapped: 5488640 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:55.947746+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226b400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82640896 unmapped: 5480448 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:56.947927+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82640896 unmapped: 5480448 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:57.948114+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82640896 unmapped: 5480448 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:58.948368+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.640151978s of 10.649944305s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82657280 unmapped: 5464064 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:44:59.948622+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951752 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82665472 unmapped: 5455872 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:00.948811+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82665472 unmapped: 5455872 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:01.948955+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82673664 unmapped: 5447680 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:02.949185+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:03.949318+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:04.949482+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951620 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:05.949728+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:06.949958+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:07.950206+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 5423104 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:08.950368+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 5423104 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:09.950705+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:10.950846+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:11.951066+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82714624 unmapped: 5406720 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:12.951318+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 5398528 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:13.951598+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 5398528 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:14.951810+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 5390336 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:15.951956+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 5390336 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:16.952177+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 5382144 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:17.952383+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 5373952 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:18.952616+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 5365760 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:19.952774+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 5365760 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:20.953038+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 5365760 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:21.953188+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 5357568 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:22.953414+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 5357568 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:23.953638+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82763776 unmapped: 5357568 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:24.953869+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82771968 unmapped: 5349376 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:25.954027+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82771968 unmapped: 5349376 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:26.954238+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:27.954493+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:28.954713+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:29.954848+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226f400 session 0x5616e2d70b40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2617800 session 0x5616e0572780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82788352 unmapped: 5332992 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:30.954987+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82788352 unmapped: 5332992 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:31.955206+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:32.955419+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:33.955533+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:34.955666+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 5316608 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:35.955794+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 5316608 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:36.955949+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82812928 unmapped: 5308416 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:37.956088+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3284000 session 0x5616df7d8780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82665472 unmapped: 5455872 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:38.956245+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82673664 unmapped: 5447680 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:39.956390+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 951488 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327dc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 41.430633545s of 41.440040588s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82673664 unmapped: 5447680 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:40.956634+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:41.956780+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:42.956941+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:43.957090+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82681856 unmapped: 5439488 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:44.957256+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953132 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:45.957401+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82690048 unmapped: 5431296 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:46.957591+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 5423104 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:47.957721+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82698240 unmapped: 5423104 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e11f9000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:48.957868+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:49.957990+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953264 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:50.958122+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82706432 unmapped: 5414912 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:51.958196+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.190359116s of 11.200112343s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 5398528 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:52.958409+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82722816 unmapped: 5398528 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:53.958547+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 5390336 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:54.958720+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321f800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 954185 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82731008 unmapped: 5390336 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:55.958861+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 5382144 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:56.959023+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 5382144 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:57.959208+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82739200 unmapped: 5382144 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:58.959326+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 5373952 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:45:59.959446+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 954053 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 5373952 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:00.959595+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82747392 unmapped: 5373952 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:01.959790+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82755584 unmapped: 5365760 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:02.959971+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.691310883s of 11.704643250s, submitted: 4
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82771968 unmapped: 5349376 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:03.960119+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82771968 unmapped: 5349376 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:04.960315+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:05.960472+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82780160 unmapped: 5341184 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:06.960605+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82788352 unmapped: 5332992 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:07.960756+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82788352 unmapped: 5332992 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:08.960876+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:09.961006+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:10.961177+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82796544 unmapped: 5324800 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:11.961356+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 5316608 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:12.961536+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82804736 unmapped: 5316608 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:13.961702+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82812928 unmapped: 5308416 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:14.961842+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82812928 unmapped: 5308416 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:15.961968+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82812928 unmapped: 5308416 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:16.962102+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 5300224 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:17.962241+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 5300224 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:18.962363+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:19.962522+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:20.962724+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:21.962866+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 5283840 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:22.963013+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 5283840 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:23.963194+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:24.963315+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:25.963488+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82853888 unmapped: 5267456 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:26.963658+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82853888 unmapped: 5267456 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:27.963815+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82862080 unmapped: 5259264 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:28.964008+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82862080 unmapped: 5259264 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:29.964181+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82870272 unmapped: 5251072 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:30.964338+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82870272 unmapped: 5251072 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:31.964494+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82878464 unmapped: 5242880 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:32.964749+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:33.964915+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:34.965071+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:35.965226+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:36.965415+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:37.965569+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82903040 unmapped: 5218304 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:38.965749+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82821120 unmapped: 5300224 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:39.965883+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:40.966049+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:41.966228+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82829312 unmapped: 5292032 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:42.966454+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 5283840 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:43.966646+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82837504 unmapped: 5283840 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:44.966815+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:45.966983+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:46.967205+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82845696 unmapped: 5275648 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:47.967415+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82853888 unmapped: 5267456 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:48.967638+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82862080 unmapped: 5259264 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:49.967836+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82862080 unmapped: 5259264 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:50.968004+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82870272 unmapped: 5251072 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:51.968197+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82870272 unmapped: 5251072 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:52.968543+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82878464 unmapped: 5242880 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:53.968750+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82878464 unmapped: 5242880 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:54.969112+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:55.969394+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:56.969535+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82886656 unmapped: 5234688 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:57.969758+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:58.969925+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82894848 unmapped: 5226496 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:46:59.970131+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82903040 unmapped: 5218304 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:00.970358+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82903040 unmapped: 5218304 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:01.971349+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82903040 unmapped: 5218304 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:02.972250+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82911232 unmapped: 5210112 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:03.973040+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82911232 unmapped: 5210112 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:04.973718+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82911232 unmapped: 5210112 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:05.974379+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82919424 unmapped: 5201920 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:06.974519+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82919424 unmapped: 5201920 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:07.974995+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 5185536 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:08.975176+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 5185536 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:09.975321+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 5185536 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:10.975487+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82944000 unmapped: 5177344 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:11.975766+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82952192 unmapped: 5169152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:12.976118+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82952192 unmapped: 5169152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:13.976246+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82960384 unmapped: 5160960 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:14.976528+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82960384 unmapped: 5160960 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:15.976660+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82968576 unmapped: 5152768 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:16.976807+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82968576 unmapped: 5152768 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:17.977014+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82976768 unmapped: 5144576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:18.977222+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82976768 unmapped: 5144576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:19.977473+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82976768 unmapped: 5144576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:20.977777+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 5136384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:21.978041+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 5136384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:22.978326+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 5128192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:23.978516+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 5128192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:24.978655+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 5120000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:25.978871+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 5120000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:26.979056+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 5120000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:27.979255+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 5111808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:28.979430+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 5111808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:29.980889+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 5111808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:30.981028+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 5103616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:31.981166+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321f800 session 0x5616e2d714a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9000 session 0x5616e1ec90e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 5103616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:32.981339+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 5087232 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:33.981527+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 5087232 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:34.981728+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 5087232 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:35.981893+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83042304 unmapped: 5079040 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:36.982060+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83042304 unmapped: 5079040 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:37.982266+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 5070848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:38.982514+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 5070848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:39.982713+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 5070848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:40.982918+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 953330 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83058688 unmapped: 5062656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:41.983135+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83058688 unmapped: 5062656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2cafc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 99.223167419s of 99.225830078s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:42.983425+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 5054464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:43.983651+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 5054464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:44.983860+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 5046272 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:45.984048+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a98800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956486 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 3989504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:46.984248+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 3989504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:47.984427+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2a98800 session 0x5616e2d71a40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327dc00 session 0x5616e2d710e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 3981312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:48.984633+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 3981312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:49.984841+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 3981312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:50.985027+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956486 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84148224 unmapped: 3973120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:51.985193+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84148224 unmapped: 3973120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:52.985319+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84148224 unmapped: 3973120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:53.985462+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84156416 unmapped: 3964928 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:54.985605+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.070820808s of 12.081949234s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83107840 unmapped: 5013504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:55.985714+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 955895 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 5005312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:56.985891+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 5005312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:57.986008+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 4997120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:58.986225+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2cad400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 4997120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:47:59.986407+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 4997120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:00.986563+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 955895 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83132416 unmapped: 4988928 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:01.986704+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83132416 unmapped: 4988928 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:02.986884+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83140608 unmapped: 4980736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:03.987036+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83140608 unmapped: 4980736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:04.987182+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2254800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83140608 unmapped: 4980736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:05.987313+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 957407 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 4972544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:06.987459+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 4972544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:07.987584+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.191962242s of 13.212522507s, submitted: 4
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 4964352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:08.987706+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 4964352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:09.987881+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83156992 unmapped: 4964352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:10.988023+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956816 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 4956160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:11.988131+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 4956160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:12.988306+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 4956160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:13.988460+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 4947968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:14.988596+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 4947968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:15.988759+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956093 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 4939776 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:16.988897+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226b400 session 0x5616e2d70d20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327c000 session 0x5616e1f7be00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 4939776 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:17.989022+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 4939776 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:18.989195+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83197952 unmapped: 4923392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:19.989331+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83197952 unmapped: 4923392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:20.989476+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956093 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83206144 unmapped: 4915200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:21.989584+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83206144 unmapped: 4915200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:22.989709+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83214336 unmapped: 4907008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:23.989859+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83214336 unmapped: 4907008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:24.990008+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83214336 unmapped: 4907008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:25.990175+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956093 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 4898816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:26.990365+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 4898816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:27.990523+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.953041077s of 19.965154648s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83230720 unmapped: 4890624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:28.990635+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83230720 unmapped: 4890624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:29.990829+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83238912 unmapped: 4882432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:30.990988+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 956225 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83247104 unmapped: 4874240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:31.991195+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83247104 unmapped: 4874240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:32.991366+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83247104 unmapped: 4874240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:33.991544+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83255296 unmapped: 4866048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:34.991805+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83255296 unmapped: 4866048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:35.992003+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 959249 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83263488 unmapped: 4857856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:36.992172+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83263488 unmapped: 4857856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:37.992434+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83271680 unmapped: 4849664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 8069 writes, 33K keys, 8069 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s
                                           Cumulative WAL: 8069 writes, 1528 syncs, 5.28 writes per sync, written: 0.02 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 8069 writes, 33K keys, 8069 commit groups, 1.0 writes per commit group, ingest: 21.03 MB, 0.04 MB/s
                                           Interval WAL: 8069 writes, 1528 syncs, 5.28 writes per sync, written: 0.02 GB, 0.04 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:38.992842+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83337216 unmapped: 4784128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:39.993679+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83337216 unmapped: 4784128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:40.994871+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958658 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 4775936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:41.995024+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 4775936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:42.995256+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83353600 unmapped: 4767744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:43.995924+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.006532669s of 16.020818710s, submitted: 4
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83353600 unmapped: 4767744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:44.996244+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83353600 unmapped: 4767744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:45.996384+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2254800 session 0x5616e05723c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2cad400 session 0x5616e2d71c20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958526 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 4759552 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:46.996551+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 4759552 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:47.996703+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83369984 unmapped: 4751360 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:48.997046+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83378176 unmapped: 4743168 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:49.997175+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83378176 unmapped: 4743168 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:50.997288+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958526 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 4734976 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:51.997424+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 4734976 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:52.997575+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83394560 unmapped: 4726784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:53.997726+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83394560 unmapped: 4726784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:54.998018+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83394560 unmapped: 4726784 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:55.998212+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958526 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 4718592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:56.998500+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108a400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.845773697s of 12.849323273s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 4718592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:57.998630+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 4718592 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:58.998967+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 4710400 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:48:59.999217+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e10eec00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 4710400 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:00.999359+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 960170 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 4702208 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:01.999499+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 4702208 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:02.999993+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2619c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83427328 unmapped: 4694016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:04.000227+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83427328 unmapped: 4694016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:05.000439+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83427328 unmapped: 4694016 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:06.000605+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 959579 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83435520 unmapped: 4685824 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:07.000788+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83435520 unmapped: 4685824 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:08.000917+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 4677632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:09.001085+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.056375504s of 12.069332123s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 4677632 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:10.001278+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 4669440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:11.001425+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958988 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 4669440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:12.001664+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 4669440 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:13.001845+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 4661248 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:14.002003+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 4661248 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:15.002120+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 4653056 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:16.002192+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 4653056 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:17.002316+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 4653056 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:18.002452+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 4644864 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:19.002535+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 4644864 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:20.002651+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 4636672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:21.002772+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 4636672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:22.002941+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 4636672 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:23.003097+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 4628480 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:24.003232+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 4628480 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:25.003349+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83501056 unmapped: 4620288 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:26.003547+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83501056 unmapped: 4620288 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:27.003704+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83509248 unmapped: 4612096 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:28.003827+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83517440 unmapped: 4603904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:29.003992+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83517440 unmapped: 4603904 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:30.004129+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83525632 unmapped: 4595712 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:31.004310+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83525632 unmapped: 4595712 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:32.004445+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83533824 unmapped: 4587520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:33.004606+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83533824 unmapped: 4587520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:34.004756+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2619c00 session 0x5616e26210e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e108a400 session 0x5616dfb752c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83533824 unmapped: 4587520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:35.004900+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83533824 unmapped: 4587520 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:36.005075+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83542016 unmapped: 4579328 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:37.005213+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83542016 unmapped: 4579328 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:38.005362+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83566592 unmapped: 4554752 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:39.005527+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83566592 unmapped: 4554752 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:40.005708+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83574784 unmapped: 4546560 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:41.005890+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83574784 unmapped: 4546560 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:42.006053+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83582976 unmapped: 4538368 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:43.006223+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e10eec00 session 0x5616e359a3c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3542800 session 0x5616e1eb41e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83582976 unmapped: 4538368 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:44.006366+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 35.694198608s of 35.783321381s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83591168 unmapped: 4530176 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:45.006914+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83599360 unmapped: 4521984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:46.007421+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958988 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83599360 unmapped: 4521984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:47.007567+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83599360 unmapped: 4521984 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:48.007706+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83607552 unmapped: 4513792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:49.007853+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83607552 unmapped: 4513792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:50.008038+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83607552 unmapped: 4513792 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:51.008234+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958988 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 4505600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:52.008492+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fca77000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 4505600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:53.008745+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 4505600 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:54.008975+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327e400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.844978333s of 10.003371239s, submitted: 44
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83632128 unmapped: 4489216 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:55.009105+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 4243456 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:56.009187+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 959120 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:57.009301+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:58.009422+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:49:59.009538+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:00.009706+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:01.009888+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 959120 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:02.010014+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:03.010217+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:04.010327+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:05.010529+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:06.010694+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958988 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:07.010826+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:08.010974+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.813514709s of 13.270147324s, submitted: 293
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:09.011162+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:10.011350+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:11.011541+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:12.011696+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:13.011870+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:14.012179+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:15.012284+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:16.015670+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:17.016910+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:18.018212+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:19.019254+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:20.020367+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:21.021258+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:22.021981+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:23.022228+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:24.022419+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:25.022823+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:26.023199+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327e400 session 0x5616e33d2b40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:27.027465+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:28.027617+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:29.027807+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83951616 unmapped: 4169728 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:30.027968+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83968000 unmapped: 4153344 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:31.028434+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83968000 unmapped: 4153344 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:32.028582+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83968000 unmapped: 4153344 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:33.028943+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 4145152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:34.029216+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 4145152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:35.029503+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83976192 unmapped: 4145152 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:36.029704+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3542400 session 0x5616e34ea5a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83984384 unmapped: 4136960 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 958856 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:37.029834+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0540c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.845741272s of 28.848985672s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83984384 unmapped: 4136960 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:38.029989+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 4128768 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:39.030241+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 83992576 unmapped: 4128768 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:40.030394+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:41.030593+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 960500 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:42.030737+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:43.030880+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:44.031021+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:45.031186+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:46.031373+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84000768 unmapped: 4120576 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 960500 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:47.031517+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2270c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.034110069s of 10.046130180s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:48.031638+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:49.031800+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:50.032001+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:51.032170+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 960500 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:52.032322+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:53.032620+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3543000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:54.032788+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:55.033776+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:56.033920+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962012 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:57.034056+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:58.034243+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84008960 unmapped: 4112384 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:50:59.034414+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.128645897s of 12.140229225s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:00.034558+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:01.034718+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961421 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:02.034911+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:03.035095+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:04.035292+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:05.035536+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:06.035749+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84017152 unmapped: 4104192 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961289 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:07.035966+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 4096000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:08.036242+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 4096000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:09.036425+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84025344 unmapped: 4096000 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:10.036581+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:11.036782+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961289 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:12.036926+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:13.037095+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e0540c00 session 0x5616e1f7af00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:14.037357+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:15.037516+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:16.037700+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961289 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:17.037830+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:18.038002+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3543000 session 0x5616e1f7ba40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2270c00 session 0x5616e337fe00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:19.038140+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:20.038366+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:21.038560+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961289 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:22.038756+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:23.038982+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 4087808 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:24.039210+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321f800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.132347107s of 25.178052902s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:25.039345+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:26.039503+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961421 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:27.039675+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:28.039833+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:29.040066+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e11f9000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:30.040198+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:31.040409+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84041728 unmapped: 4079616 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 961553 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:32.040610+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84049920 unmapped: 4071424 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:33.040819+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84049920 unmapped: 4071424 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:34.041022+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84049920 unmapped: 4071424 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:35.041224+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 84049920 unmapped: 4071424 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:36.041377+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321e000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.086726189s of 12.105925560s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962474 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:37.041547+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:38.041711+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:39.041917+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:40.042095+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:41.042294+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962474 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:42.042449+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:43.042588+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:44.042733+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:45.042917+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:46.043077+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85098496 unmapped: 3022848 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962342 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:47.043225+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.559317589s of 10.567891121s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:48.043392+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:49.043533+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:50.043721+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:51.043852+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:52.044034+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321e000 session 0x5616e34dbc20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321f800 session 0x5616e0d17860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:53.044226+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:54.044380+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:55.044510+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:56.044695+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:57.044833+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:58.044940+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:59.045121+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:00.045400+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:01.045576+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:02.045736+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:03.045959+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226ac00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.825603485s of 16.828807831s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:04.046201+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:05.046397+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:06.046606+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962342 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:07.046827+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:08.047051+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:09.047248+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:10.047446+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:11.047691+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:12.047931+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963854 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:13.048276+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:14.048446+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:15.048634+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.089159966s of 12.096708298s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:16.048781+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:17.048983+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963263 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:18.049216+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:19.049430+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:20.049666+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:21.049877+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:22.050068+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:23.050289+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:24.050423+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:25.050549+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:26.050713+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:27.050894+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:28.051042+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:29.051219+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:30.051372+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:31.051522+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:32.051666+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:33.051860+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:34.052034+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:35.052217+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:36.052416+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:37.052576+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:38.052773+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:39.052941+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:40.053087+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:41.053225+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:42.053351+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:43.053562+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:44.053793+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:45.053976+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:46.054238+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:47.054406+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:48.054562+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:49.054692+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:50.054871+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:51.055019+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:52.055257+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:53.074992+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:54.075201+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:55.075371+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:56.075542+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:57.075731+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:58.075918+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:59.076064+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:00.076311+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:01.076499+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:02.076649+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3283c00 session 0x5616e3750f00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2cafc00 session 0x5616e3750960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:03.076804+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3542000 session 0x5616e311a000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226ac00 session 0x5616e2797860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:04.076947+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:05.077108+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:06.077206+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:07.077343+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:08.077487+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:09.077632+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:10.077766+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:11.077924+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:12.078062+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:13.078223+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3220400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 57.016975403s of 57.231136322s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:14.078381+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e11f9800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85188608 unmapped: 2932736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:15.078552+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85188608 unmapped: 2932736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:16.078734+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3281c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:17.078874+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964907 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:18.079037+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:19.079250+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321a800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:20.079388+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:21.079537+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:22.079758+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966419 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:23.080194+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:24.080318+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:25.080453+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:26.080634+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.142086029s of 13.158586502s, submitted: 4
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:27.080814+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:28.080953+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:29.083589+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:30.084446+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:31.084595+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:32.084741+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:33.086063+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:34.086240+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9000 session 0x5616e278a000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3281c00 session 0x5616e312c000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:35.086363+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226c400 session 0x5616e311fe00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9800 session 0x5616e311a3c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:36.086503+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:37.086640+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:38.086786+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:39.087262+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:40.087411+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:41.087543+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:42.087684+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:43.087880+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:44.088070+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3281000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.639591217s of 18.651557922s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:45.088240+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:46.088483+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e1193000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:47.088677+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:48.088824+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:49.088977+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:50.089139+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:51.089321+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e4000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:52.089461+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:53.089753+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:54.090388+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:55.090552+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.387768745s of 10.394624710s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:56.090719+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:57.090887+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964646 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:58.091041+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:59.091280+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:00.091486+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:01.091685+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:02.091819+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964514 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:03.091981+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:04.092203+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:05.092335+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:06.092471+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:07.092582+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:08.092781+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:09.092911+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:10.093048+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:11.093208+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:12.093344+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:13.093568+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:14.093718+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:15.093865+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:16.094004+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:17.094210+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:18.094401+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21e4000 session 0x5616e311af00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3281000 session 0x5616e312ef00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:19.094531+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:20.094678+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:21.094846+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:22.095023+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:23.095219+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:24.095396+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:25.095605+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:26.095736+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:27.095900+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:28.096060+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2caf800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.559337616s of 33.630935669s, submitted: 4
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:29.096202+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:30.096345+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:31.096488+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:32.096686+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3222000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:33.097500+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:34.097652+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:35.097848+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:36.098022+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:37.098181+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:38.098322+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:39.098518+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:40.098715+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:41.098899+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:42.099052+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:43.099204+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:44.099389+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.959052086s of 15.965865135s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:45.099531+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:46.099700+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:47.099839+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e0541c00 session 0x5616e0d6e1e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e11f9800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:48.099989+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:49.100260+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:50.100414+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:51.100576+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:52.100735+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:53.100962+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:54.101174+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:55.101375+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:56.101552+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:57.101741+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:58.101883+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:59.102041+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:00.102194+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:01.102347+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:02.102499+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:03.102667+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:04.102810+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:05.102939+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:06.103073+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:07.103264+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:08.103468+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:09.103622+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:10.103769+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:11.103913+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:12.104080+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:13.104213+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:14.104438+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:15.104601+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:16.104769+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:17.104942+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:18.105094+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:19.105244+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:20.105400+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:21.105648+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:22.105859+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:23.106096+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:24.106281+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:25.106485+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:26.106671+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:27.106887+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3222000 session 0x5616e311b0e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e1193000 session 0x5616e312c960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:28.107044+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:29.107212+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:30.107355+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:31.107497+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:32.107647+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:33.107807+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:34.107945+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:35.108114+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:36.108279+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:37.108544+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:38.108918+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 53.633445740s of 53.641864777s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:39.109057+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:40.109431+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:41.109919+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21adc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:42.110618+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967538 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:43.111269+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:44.111403+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:45.112589+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:46.113055+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:47.113438+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967538 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:48.113620+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:49.113781+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:50.113958+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.074842453s of 12.082664490s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:51.114134+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:52.114533+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966947 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:53.114949+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:54.115130+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:55.115336+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:56.115536+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:57.115680+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:58.115823+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:59.116051+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:00.116212+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:01.116364+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:02.116497+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321a800 session 0x5616e311a780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3220400 session 0x5616e34ea5a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:03.116671+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:04.116811+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:05.116962+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:06.117203+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:07.117344+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:08.117468+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:09.117580+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:10.117706+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:11.117839+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 2834432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:12.118002+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 2834432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:13.118154+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327a000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.421203613s of 22.434373856s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:14.118299+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:15.118496+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:16.118643+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:17.118845+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968459 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:18.119074+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:19.119291+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:20.119424+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:21.119627+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:22.119774+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967868 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:23.119972+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:24.120106+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:25.120289+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:26.120487+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:27.120677+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967868 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:28.120804+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.030465126s of 15.042689323s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327a000 session 0x5616e3750960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:29.121068+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:30.121247+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:31.121418+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:32.121593+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967736 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:33.121805+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:34.121961+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:35.122121+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:36.122331+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:37.122511+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967736 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:38.122711+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:39.122878+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3282400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.958620071s of 10.963719368s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:40.123102+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:41.123282+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:42.123506+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970892 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:43.123993+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:44.125060+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:45.125513+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:46.125648+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:47.125784+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:48.125932+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970892 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:49.126063+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:50.126198+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:51.126373+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:52.126800+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:53.127011+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970301 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.377700806s of 13.986701965s, submitted: 4
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:54.127269+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:55.127455+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:56.127643+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:57.127866+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:58.128060+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:59.128228+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:00.128393+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:01.128554+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:02.128699+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:03.128933+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:04.129073+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:05.129190+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:06.129310+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:07.129473+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:08.129596+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:09.129706+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:10.129817+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:11.130050+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:12.130252+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:13.130427+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:14.130548+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:15.130717+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:16.130855+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:17.131015+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:18.131184+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:19.131325+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:20.131485+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:21.131657+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:22.131800+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:23.132014+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:24.132163+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:25.132313+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:26.132452+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:27.132579+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:28.132730+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:29.132931+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:30.133114+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3282400 session 0x5616e311a3c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:31.133210+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:32.133359+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:33.133543+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:34.133742+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:35.133913+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:36.134179+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:37.134395+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:38.134642+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:39.134926+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:40.135132+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.26851 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3225208465' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4269553776' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.26831 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1147547086' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.26875 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2525322596' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/734854259' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1407366583' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:14:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3165997439' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:41.135382+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3285400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 48.270744324s of 48.274307251s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:42.135573+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:43.135803+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970301 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:44.135939+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:45.136096+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:46.136263+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:47.136387+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e1193c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:48.136506+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971813 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:49.136639+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:50.136726+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:51.136852+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:52.137046+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:53.137213+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:54.137377+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:55.137522+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:56.137697+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:57.137841+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:58.137973+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:59.138289+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.719709396s of 17.732963562s, submitted: 4
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:00.138465+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:01.138621+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:02.138783+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:03.139015+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:04.139398+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:05.139560+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:06.139745+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:07.139966+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:08.140222+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:09.140413+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:10.140571+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:11.140768+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:12.141044+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:13.141347+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:14.141500+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:15.141665+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:16.141812+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:17.141980+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:18.142097+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:19.142313+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:20.142469+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:21.142615+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:22.142793+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21adc00 session 0x5616e311b860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2caf800 session 0x5616e337f2c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:23.142977+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:24.143059+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:25.143202+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:26.143378+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:27.143525+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:28.143726+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:29.143908+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:30.144060+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:31.144588+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:32.144912+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:33.145256+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.156009674s of 34.160236359s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:34.145394+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:35.145554+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:36.145696+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:37.145850+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:38.146048+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 8815 writes, 34K keys, 8815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 8815 writes, 1876 syncs, 4.70 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 746 writes, 1209 keys, 746 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
                                           Interval WAL: 746 writes, 348 syncs, 2.14 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:39.146257+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:40.146419+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:41.146618+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:42.146762+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e1193c00 session 0x5616e311bc20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3285400 session 0x5616e311a780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:43.146949+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:44.147038+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 02 10:14:24 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/933138151' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:45.147241+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:46.147385+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:47.147548+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:48.147670+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.711874962s of 15.720390320s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:49.147795+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:50.147968+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:51.148111+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:52.148321+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:53.148748+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e10efc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:54.148964+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:55.149127+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:56.149319+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:57.149489+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:58.149659+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:59.149917+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d4800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:00.150102+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:01.150301+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:02.150479+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:03.150673+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:04.150809+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:05.150910+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.441757202s of 16.448879242s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:06.151079+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:07.151243+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:08.151415+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971552 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:09.151554+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:10.151758+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:11.151909+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:12.152099+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:13.152363+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:14.152522+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:15.152609+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:16.152756+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:17.152905+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:18.153071+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:19.153238+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:20.153431+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:21.153566+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:22.153741+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:23.153959+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:24.154119+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:25.154260+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:26.154449+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:27.154673+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:28.154834+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:29.154979+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:30.155114+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:31.155323+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:32.155496+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:33.155716+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:34.155881+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:35.156056+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:36.156204+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:37.156397+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:38.156544+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:39.156756+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:40.156946+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:41.484530+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:42.484651+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:43.484832+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:44.484990+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e34d4800 session 0x5616e3688780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e10efc00 session 0x5616e312ef00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:45.485179+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:46.485358+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:47.485547+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:48.485732+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:49.485872+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:50.485994+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:51.486115+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:52.486276+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:53.486464+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:54.486654+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.622116089s of 49.629035950s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3280c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:55.486795+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 3768320 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:56.486935+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 3612672 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:57.487076+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:58.487261+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971552 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:59.487398+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:00.487583+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108a800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:01.487724+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:02.487870+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:03.488029+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973064 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:04.488218+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:05.488371+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:06.488538+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:07.488691+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread fragmentation_score=0.000029 took=0.000037s
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:08.488906+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973064 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:09.489026+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:10.489229+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:11.489385+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:12.489532+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.980213165s of 18.280017853s, submitted: 343
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:13.489704+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:14.489814+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:15.489960+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:16.490117+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:17.490325+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:18.490486+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:19.490618+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:20.490749+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:21.490870+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:22.491020+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:23.491228+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:24.491394+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:25.491575+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:26.491716+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:27.491851+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:28.492035+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:29.492207+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:30.492399+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:31.492535+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:32.492664+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:33.492828+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:34.492979+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:35.493114+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:36.493262+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:37.493409+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:38.493537+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:39.493705+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:40.493844+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:41.493966+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:42.494070+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:43.494205+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:44.494359+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21ac400 session 0x5616e3688000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:45.494503+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:46.494631+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:47.494774+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:48.494870+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:49.495013+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:50.495186+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:51.495328+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:52.495497+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:53.495702+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:54.495915+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2273400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 41.939346313s of 41.943122864s, submitted: 1
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:55.496060+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:56.496213+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:57.496400+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:58.496542+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 974576 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:59.496698+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:00.496860+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:01.497070+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:02.497269+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:03.497443+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 974576 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:04.497615+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:05.497793+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:06.497964+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:07.498105+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:08.498294+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973985 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:09.498455+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:10.498629+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:11.498773+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:12.498962+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.912214279s of 17.978521347s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:13.499241+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _renew_subs
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 977619 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:14.499443+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327c800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 94093312 unmapped: 3465216 heap: 97558528 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:15.499621+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 141 ms_handle_reset con 0x5616e327c800 session 0x5616e1f7b0e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85827584 unmapped: 20127744 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb65f000/0x0/0x4ffc00000, data 0x10f66e2/0x11ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e4c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:16.499853+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85868544 unmapped: 20086784 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:17.500017+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 142 ms_handle_reset con 0x5616e21e4c00 session 0x5616e37ce780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:18.500239+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1141871 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:19.500400+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:20.500615+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:21.500814+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fae59000/0x0/0x4ffc00000, data 0x18fa8f2/0x19b1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:22.501009+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:23.501230+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:24.501381+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:25.501581+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:26.501729+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:27.501895+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:28.502058+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:29.502231+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:30.502345+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:31.502532+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:32.502756+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:33.502972+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:34.503138+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:35.503376+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:36.503580+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:37.503712+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:38.503892+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:39.504096+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:40.504244+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:41.504408+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:42.504598+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e108a800 session 0x5616e3688d20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e3280c00 session 0x5616e36965a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:43.504825+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:44.504928+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:45.505099+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:46.505294+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:47.505437+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:48.505609+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:49.505748+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:50.505896+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:51.506052+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:52.506222+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3222400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 39.927474976s of 40.098861694s, submitted: 30
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:53.506430+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147855 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:54.506554+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:55.506701+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:56.506846+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321c800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:57.506982+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae58000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:58.507121+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148527 data_alloc: 218103808 data_used: 143360
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:59.507266+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327c400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226a400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e226a400 session 0x5616e1eb4780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:00.507406+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:01.507558+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 145 heartbeat osd_stat(store_statfs(0x4fae58000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:02.507715+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2cac000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 146 ms_handle_reset con 0x5616e2cac000 session 0x5616e34770e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102490112 unmapped: 3465216 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:03.507916+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 3440640 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212945 data_alloc: 234881024 data_used: 13774848
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:04.508530+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:05.508874+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.954066277s of 12.084420204s, submitted: 24
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:06.509036+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fad62000/0x0/0x4ffc00000, data 0x19eeaf0/0x1aa8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:07.509421+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102416384 unmapped: 3538944 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:08.509602+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102416384 unmapped: 3538944 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213508 data_alloc: 234881024 data_used: 13774848
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:09.509948+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321dc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 2940928 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:10.510165+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:11.510461+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:12.510919+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:13.511237+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218848 data_alloc: 234881024 data_used: 14458880
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327c400 session 0x5616e34eb2c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3222400 session 0x5616e3378780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:14.511372+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:15.511630+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:16.511998+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:17.512193+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:18.512463+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218848 data_alloc: 234881024 data_used: 14458880
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:19.512678+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:20.512883+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.370294571s of 15.401467323s, submitted: 11
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 2965504 heap: 109101056 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [0,0,1])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:21.513082+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106307584 unmapped: 2793472 heap: 109101056 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:22.513254+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:23.513439+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1281178 data_alloc: 234881024 data_used: 14733312
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:24.513566+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:25.513741+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:26.513969+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f944e000/0x0/0x4ffc00000, data 0x215bac2/0x2216000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:27.514121+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:28.514420+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276638 data_alloc: 234881024 data_used: 14733312
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:29.514626+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:30.514806+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2619000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.930480957s of 10.239095688s, submitted: 95
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:31.514964+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9435000/0x0/0x4ffc00000, data 0x217cac2/0x2237000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:32.515206+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:33.515393+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1277223 data_alloc: 234881024 data_used: 14733312
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:34.515544+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9435000/0x0/0x4ffc00000, data 0x217cac2/0x2237000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:35.515728+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:36.515917+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:37.516059+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:38.516196+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1277072 data_alloc: 234881024 data_used: 14733312
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:39.516347+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:40.516503+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f942c000/0x0/0x4ffc00000, data 0x2185ac2/0x2240000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321e000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e000 session 0x5616e1f04960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327d800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d800 session 0x5616e1f054a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0ff1c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0ff1c00 session 0x5616e1f05e00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106577920 unmapped: 3571712 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:41.516655+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106577920 unmapped: 3571712 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:42.516809+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283800 session 0x5616e1f04780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 1802240 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.226655960s of 12.266713142s, submitted: 7
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:43.516986+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321f000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321f000 session 0x5616e337e000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0ff1c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108953600 unmapped: 3293184 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0ff1c00 session 0x5616e337f860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293665 data_alloc: 234881024 data_used: 15781888
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:44.517122+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321e000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e000 session 0x5616e0d161e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:45.517300+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:46.517456+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:47.517608+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:48.517785+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:49.517917+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293681 data_alloc: 234881024 data_used: 15781888
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:50.518101+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 3219456 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:51.518275+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 3219456 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:52.518459+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99000 session 0x5616e1f7b2c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109043712 unmapped: 3203072 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:53.518746+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e10eec00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.034008980s of 10.189837456s, submitted: 37
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109182976 unmapped: 3063808 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:54.518890+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1294918 data_alloc: 234881024 data_used: 15740928
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:55.519002+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:56.519133+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:57.519260+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:58.519368+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:59.546943+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1295070 data_alloc: 234881024 data_used: 15749120
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:00.547096+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:01.547243+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:02.547390+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321c800 session 0x5616e3750d20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2273400 session 0x5616e36892c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108748800 unmapped: 3497984 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:03.547558+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92c9000/0x0/0x4ffc00000, data 0x22e7b24/0x23a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.504937172s of 10.520147324s, submitted: 3
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108732416 unmapped: 5611520 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:04.547728+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1341628 data_alloc: 234881024 data_used: 15806464
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109731840 unmapped: 4612096 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:05.547860+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:06.548047+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29abb24/0x2a67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:07.548193+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:08.548375+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99800 session 0x5616e27985a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542c00 session 0x5616e2797860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:09.548580+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352434 data_alloc: 234881024 data_used: 15826944
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:10.549122+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29aeb24/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:11.549257+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29aeb24/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:12.549376+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226b000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:13.549506+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e10eec00 session 0x5616e311e1e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542000 session 0x5616e2620000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2617400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.650221825s of 10.002939224s, submitted: 115
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617400 session 0x5616e337e3c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:14.549655+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286162 data_alloc: 234881024 data_used: 15585280
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:15.549794+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9422000/0x0/0x4ffc00000, data 0x218eac2/0x2249000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:16.549957+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f941f000/0x0/0x4ffc00000, data 0x2191ac2/0x224c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:17.550092+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:18.550239+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f941f000/0x0/0x4ffc00000, data 0x2191ac2/0x224c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:19.550397+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286522 data_alloc: 234881024 data_used: 15585280
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3220800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 4669440 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:20.550563+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 4669440 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:21.550739+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321dc00 session 0x5616e3475a40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:22.550882+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e3751860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:23.551077+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:24.551203+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212351 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:25.551294+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.967247963s of 12.068682671s, submitted: 26
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d4000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:26.551466+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:27.551627+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:28.551737+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:29.551863+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215375 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:30.551987+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:31.552136+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 5431296 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:32.552292+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 5431296 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:33.552514+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:34.552669+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215111 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:35.552847+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:36.553021+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:37.553168+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:38.553308+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:39.554253+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215111 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:40.554450+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:41.554806+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327e000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.472117424s of 15.489373207s, submitted: 5
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327e000 session 0x5616e2797c20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:42.554977+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:43.555219+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f922c000/0x0/0x4ffc00000, data 0x2385ac2/0x2440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:44.555370+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290065 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:45.555556+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:46.555720+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:47.555932+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f922c000/0x0/0x4ffc00000, data 0x2385ac2/0x2440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:48.556104+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:49.556233+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290065 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e2d70d20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220800 session 0x5616e312f4a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:50.556425+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e337ef00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109068288 unmapped: 22077440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:51.556619+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321dc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [1])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114139136 unmapped: 17006592 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:52.556829+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118480896 unmapped: 12664832 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:53.557067+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:54.557229+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366317 data_alloc: 234881024 data_used: 25640960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:55.557428+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:56.557569+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:57.557739+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:58.557941+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:59.558137+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366317 data_alloc: 234881024 data_used: 25640960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:00.558381+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321a000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.837284088s of 18.897920609s, submitted: 9
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:01.558558+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:02.558753+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 12107776 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:03.558944+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119062528 unmapped: 12083200 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f901d000/0x0/0x4ffc00000, data 0x2594ac2/0x264f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:04.559235+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1397161 data_alloc: 234881024 data_used: 26198016
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:05.559421+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:06.559629+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d6000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:07.559805+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:08.559964+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:09.560137+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1396570 data_alloc: 234881024 data_used: 26198016
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:10.560351+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:11.560581+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:12.560794+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.991930008s of 12.083539009s, submitted: 22
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:13.561050+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:14.561289+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394635 data_alloc: 234881024 data_used: 26198016
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:15.561556+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:16.561697+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:17.561905+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:18.562069+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:19.562262+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:20.562469+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:21.562613+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:22.562744+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:23.562948+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:24.563057+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:25.563241+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:26.563364+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:27.563561+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:28.563715+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:29.563866+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321dc00 session 0x5616e3475e00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e0ffeb40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3222c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.675941467s of 17.711282730s, submitted: 2
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:30.563991+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3222c00 session 0x5616e3689a40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:31.564137+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:32.564374+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:33.564602+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:34.564768+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:35.564977+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:36.565201+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:37.565545+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:38.565698+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:39.565838+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:40.566022+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:41.566223+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:42.566346+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:43.566508+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:44.566678+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:45.566956+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:46.567106+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:47.567317+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:48.567496+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:49.567734+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:50.567863+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:51.568008+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:52.568219+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:53.568441+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:54.568578+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:55.568718+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:56.568877+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:57.568998+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2273400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2273400 session 0x5616e34eaf00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e34ead20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e278ad20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:58.569123+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2271c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114892800 unmapped: 16252928 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e3408000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3284800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284800 session 0x5616e36914a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.119201660s of 29.155227661s, submitted: 11
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:59.569234+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99c00 session 0x5616e311b0e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 20881408 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e10e7c20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e10e63c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2271c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e0572780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3284800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284800 session 0x5616e1f7a1e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273675 data_alloc: 234881024 data_used: 14823424
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ad2/0x19be000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:00.569408+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:01.569516+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:02.569615+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:03.569771+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113844224 unmapped: 20979712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:04.569926+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113844224 unmapped: 20979712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4000 session 0x5616e0d17860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226b000 session 0x5616e37cef00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273675 data_alloc: 234881024 data_used: 14823424
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e34ea780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:05.570100+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2271c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:06.570238+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:07.570407+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:08.570551+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:09.570731+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321277 data_alloc: 234881024 data_used: 16965632
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:10.570905+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:11.571059+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:12.571205+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:13.571420+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:14.571589+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321277 data_alloc: 234881024 data_used: 16965632
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3282800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.900746346s of 15.997513771s, submitted: 18
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:15.571756+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:16.572182+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:17.572428+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:18.572782+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90fc000/0x0/0x4ffc00000, data 0x24b3af5/0x2570000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113655808 unmapped: 21168128 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:19.573202+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90fc000/0x0/0x4ffc00000, data 0x24b3af5/0x2570000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367081 data_alloc: 234881024 data_used: 17113088
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:20.573555+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:21.573935+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:22.574222+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:23.574605+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:24.574802+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366154 data_alloc: 234881024 data_used: 17113088
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:25.575022+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:26.575306+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:27.575506+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:28.575733+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:29.575999+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366154 data_alloc: 234881024 data_used: 17113088
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:30.576204+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:31.576439+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:32.576669+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.825149536s of 17.985601425s, submitted: 37
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:33.576984+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:34.577132+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366022 data_alloc: 234881024 data_used: 17113088
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:35.577353+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:36.577550+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321bc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321bc00 session 0x5616e3696d20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327f800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f800 session 0x5616df7d8780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0e07000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e311f860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0e07000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616dfb71c20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:37.577726+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e0d161e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226b000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226b000 session 0x5616e1f054a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321bc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321bc00 session 0x5616e311a960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327f800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f800 session 0x5616e37ced20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0e07000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e0d6ed20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114393088 unmapped: 20430848 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c38000/0x0/0x4ffc00000, data 0x2976b57/0x2a34000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:38.577911+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:39.578103+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413608 data_alloc: 234881024 data_used: 17113088
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:40.578277+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:41.578467+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114434048 unmapped: 20389888 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:42.578657+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114434048 unmapped: 20389888 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:43.578939+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327d400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.185551643s of 10.350721359s, submitted: 46
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e34db4a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 20652032 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:44.579120+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3280000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 20652032 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417109 data_alloc: 234881024 data_used: 17113088
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:45.579278+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114868224 unmapped: 19955712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:46.579488+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:47.579660+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:48.579850+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:49.580015+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449333 data_alloc: 234881024 data_used: 20922368
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:50.580218+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:51.580369+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:52.580554+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:53.580819+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:54.580975+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449333 data_alloc: 234881024 data_used: 20922368
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:55.581178+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 18014208 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.333662033s of 12.362901688s, submitted: 6
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:56.581363+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 13238272 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:57.581515+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 12673024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:58.581673+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123871232 unmapped: 10952704 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:59.581853+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f833b000/0x0/0x4ffc00000, data 0x3272b7a/0x3331000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123904000 unmapped: 10919936 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1524459 data_alloc: 234881024 data_used: 21643264
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:00.582050+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123904000 unmapped: 10919936 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:01.582200+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123912192 unmapped: 10911744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3280000 session 0x5616e2cd8b40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99000 session 0x5616e311e960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:02.582338+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2caf800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123445248 unmapped: 11378688 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2caf800 session 0x5616e2cd8780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:03.582536+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:04.582648+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370129 data_alloc: 234881024 data_used: 15147008
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:05.582846+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:06.583055+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:07.583236+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:08.583343+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:09.583525+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370129 data_alloc: 234881024 data_used: 15147008
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:10.583674+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:11.583851+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119586816 unmapped: 15237120 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e34061e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.954436302s of 15.854330063s, submitted: 169
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e311a5a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226d000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:12.583972+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226d000 session 0x5616e1f052c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:13.584194+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:14.584365+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:15.584559+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:16.584721+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:17.584889+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:18.585049+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:19.585235+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:20.585393+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:21.585582+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:22.585741+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:23.585935+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:24.586084+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:25.586252+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:26.586406+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:27.586598+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:28.586777+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:29.586940+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:30.587104+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:31.587303+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321c400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.149909973s of 20.260728836s, submitted: 33
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321c400 session 0x5616e04dd680
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:32.587479+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2caec00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2caec00 session 0x5616e2797860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e263c960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226d000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226d000 session 0x5616e359af00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2271c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e04dc1e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:33.587684+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:34.587908+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250106 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:35.588098+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:36.588221+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:37.588378+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:38.588513+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321e800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:39.588637+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250238 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:40.588771+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:41.588888+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:42.589038+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:43.589208+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:44.589374+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250238 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:45.589558+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:46.589737+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:47.589872+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:48.590004+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:49.590136+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.556289673s of 17.624111176s, submitted: 16
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255618 data_alloc: 234881024 data_used: 10326016
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:50.590342+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 17833984 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b34000/0x0/0x4ffc00000, data 0x1a7cac2/0x1b37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:51.590478+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:52.590616+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:53.590784+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:54.590900+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:55.591024+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:56.591177+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:57.591326+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:58.591483+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:59.591615+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:00.591831+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:01.591966+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:02.592123+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:03.592394+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:04.592582+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:05.592737+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:06.592896+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:07.593029+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d6000 session 0x5616e0f6e000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321a000 session 0x5616e337e5a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:08.593231+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:09.593363+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:10.593494+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283400 session 0x5616e2d70000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282800 session 0x5616e34741e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:11.593636+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:12.593769+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:13.593973+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:14.594138+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:15.594329+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:16.594458+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:17.594582+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:18.594813+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.186046600s of 29.217700958s, submitted: 7
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:19.594940+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:20.595075+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258580 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283000 session 0x5616e313e780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:21.595228+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327cc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:22.595342+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:23.595481+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f5000/0x0/0x4ffc00000, data 0x20bcac2/0x2177000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:24.595693+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2cac000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:25.597376+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310128 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:26.599742+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:27.601653+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0540c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 23166976 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:28.605939+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321cc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321cc00 session 0x5616e3476780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f5000/0x0/0x4ffc00000, data 0x20bcac2/0x2177000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3543400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3543400 session 0x5616e33d3e00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 23166976 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:29.606093+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321cc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321cc00 session 0x5616e33d3a40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3282800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.488653183s of 10.665133476s, submitted: 14
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282800 session 0x5616e33d23c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 23158784 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:30.609037+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315916 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 23158784 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:31.609220+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 23068672 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:32.609415+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:33.609849+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:34.610072+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:35.610196+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357250 data_alloc: 234881024 data_used: 16588800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:36.610366+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:37.610529+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:38.610776+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:39.611098+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:40.611334+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357250 data_alloc: 234881024 data_used: 16588800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:41.611467+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.383611679s of 12.435736656s, submitted: 16
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 16957440 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:42.611633+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122388480 unmapped: 16113664 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:43.611868+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:44.612088+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:45.612322+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1437520 data_alloc: 234881024 data_used: 17522688
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8bae000/0x0/0x4ffc00000, data 0x2a00af5/0x2abd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:46.612479+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8bae000/0x0/0x4ffc00000, data 0x2a00af5/0x2abd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:47.612645+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122970112 unmapped: 15532032 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:48.612843+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:49.613008+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:50.613229+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435392 data_alloc: 234881024 data_used: 17522688
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:51.613400+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b90000/0x0/0x4ffc00000, data 0x2a1faf5/0x2adc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:52.613576+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:53.613916+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:54.614297+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.516505241s of 13.009800911s, submitted: 89
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 16072704 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:55.614581+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435504 data_alloc: 234881024 data_used: 17522688
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122437632 unmapped: 16064512 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:56.614743+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122437632 unmapped: 16064512 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:57.615096+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:58.615227+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:59.615365+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:00.615522+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435504 data_alloc: 234881024 data_used: 17522688
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:01.615663+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:02.615828+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:03.616036+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:04.616263+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:05.616403+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:06.616556+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:07.616675+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:08.616852+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:09.617055+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:10.617192+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:11.617317+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:12.617447+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:13.617650+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:14.617787+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.440578461s of 19.455564499s, submitted: 4
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:15.617925+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:16.618076+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:17.618227+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:18.618411+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:19.618555+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:20.618657+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283400 session 0x5616e312e960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283000 session 0x5616e37ceb40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0e07000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:21.618818+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e34065a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:22.619008+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:23.619220+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:24.619314+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:25.619387+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268225 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:26.619495+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:27.619632+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:28.619813+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:29.620984+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:30.622012+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268225 data_alloc: 234881024 data_used: 10338304
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:31.624209+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.921215057s of 17.030107498s, submitted: 36
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e800 session 0x5616e3406d20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542400 session 0x5616e34772c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:32.624393+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:33.624616+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:34.624785+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:35.625711+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:36.625973+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:37.626197+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:38.626585+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 11K writes, 41K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 11K writes, 2812 syncs, 3.92 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2196 writes, 6935 keys, 2196 commit groups, 1.0 writes per commit group, ingest: 6.82 MB, 0.01 MB/s
                                           Interval WAL: 2196 writes, 936 syncs, 2.35 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:39.626953+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:40.627290+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:41.627580+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:42.627791+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:43.627944+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:44.628233+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:45.628472+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:46.628692+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:47.628877+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:48.629384+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:49.629922+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:50.630235+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2617c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.137660980s of 19.212322235s, submitted: 21
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617c00 session 0x5616e312fe00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:51.630430+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:52.630668+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:53.630929+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:54.631109+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:55.631275+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9516000/0x0/0x4ffc00000, data 0x1c8bac2/0x1d46000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282483 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327d400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e34734a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:56.631455+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e10efc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e10efc00 session 0x5616e34ea3c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9516000/0x0/0x4ffc00000, data 0x1c8bac2/0x1d46000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2617c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617c00 session 0x5616e2d70780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321e800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:57.631614+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e800 session 0x5616e1f05c20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116129792 unmapped: 22372352 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327d400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:58.631853+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2255c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116170752 unmapped: 22331392 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:59.632023+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 21725184 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:00.632218+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2255c00 session 0x5616e2d71680
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e3476f00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 21725184 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312446 data_alloc: 234881024 data_used: 14000128
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3285000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.036809921s of 10.113059044s, submitted: 15
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3285000 session 0x5616e36881e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f1000/0x0/0x4ffc00000, data 0x1cafad2/0x1d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:01.632367+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:02.632519+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:03.632693+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:04.632840+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:05.633011+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258361 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:06.633193+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:07.633366+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:08.634205+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2619000 session 0x5616e33d2780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e3494000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:09.634340+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283c00 session 0x5616e3691c20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2270c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e34ea1e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99800 session 0x5616e337f0e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e0fffc20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2270c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e05730e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2619000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2619000 session 0x5616e2cd83c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:10.634582+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283c00 session 0x5616e05a2780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e5400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5400 session 0x5616dfb74b40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e311b680
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1346877 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8d84000/0x0/0x4ffc00000, data 0x241cad2/0x24d8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:11.634767+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:12.634919+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2270c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e3476000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:13.635061+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3282c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282c00 session 0x5616e34761e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321ec00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321ec00 session 0x5616e359ab40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d5800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.106111526s of 13.270271301s, submitted: 38
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5800 session 0x5616e33d2f00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:14.635246+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2270c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:15.635436+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 29736960 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348691 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:16.635624+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8d83000/0x0/0x4ffc00000, data 0x241cae2/0x24d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:17.635777+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:18.635977+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e3404780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e04dc960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2cae400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:19.636118+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2cae400 session 0x5616e311a000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:20.636262+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:21.636402+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:22.636573+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:23.636728+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:24.636868+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:25.637018+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:26.637238+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:27.637409+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:28.637610+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:29.637802+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:30.637965+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:31.638169+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:32.638401+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:33.638647+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:34.638853+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.711370468s of 20.742525101s, submitted: 12
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:35.639029+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265990 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:36.639220+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:37.639414+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:38.639610+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:39.640068+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:40.640267+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265990 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:41.640451+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:42.640641+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:43.640837+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327fc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327fc00 session 0x5616e312d680
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:44.641022+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:45.641204+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298342 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0541400 session 0x5616dfb705a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0541400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:46.641385+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0541000 session 0x5616dfb703c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f961c000/0x0/0x4ffc00000, data 0x1b85ac2/0x1c40000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:47.641547+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e11f9800 session 0x5616e311b4a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0541000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116719616 unmapped: 29130752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:48.641739+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116719616 unmapped: 29130752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:49.641917+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d4800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.859712601s of 14.898717880s, submitted: 17
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4800 session 0x5616e04dde00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3220400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3284c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:50.642077+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301202 data_alloc: 234881024 data_used: 10452992
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:51.642202+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:52.642332+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:53.642510+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:54.642638+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:55.642785+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116490240 unmapped: 29360128 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308194 data_alloc: 234881024 data_used: 11509760
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:56.643002+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116695040 unmapped: 29155328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:57.643209+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116858880 unmapped: 28991488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:58.643364+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116858880 unmapped: 28991488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:59.643553+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116867072 unmapped: 28983296 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:00.643710+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116817920 unmapped: 29032448 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.749152184s of 11.151672363s, submitted: 349
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311098 data_alloc: 234881024 data_used: 11579392
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:01.643884+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119087104 unmapped: 26763264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:02.644067+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:03.644266+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:04.644427+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8920000/0x0/0x4ffc00000, data 0x286aac2/0x2925000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:05.644585+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422902 data_alloc: 234881024 data_used: 11780096
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:06.644756+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:07.644952+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8920000/0x0/0x4ffc00000, data 0x286aac2/0x2925000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:08.645189+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:09.645374+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:10.645544+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1414566 data_alloc: 234881024 data_used: 11788288
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:11.645695+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:12.645843+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.005815506s of 12.341868401s, submitted: 120
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:13.646092+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8903000/0x0/0x4ffc00000, data 0x289eac2/0x2959000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 26370048 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:14.646275+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220400 session 0x5616e04dd680
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284c00 session 0x5616e3477680
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 26370048 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108b400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:15.646481+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108b400 session 0x5616dfb634a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:16.646673+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:17.646804+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:18.646970+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:19.647185+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:20.647398+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:21.647580+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:22.647733+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:23.647970+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:24.648156+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:25.648308+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:26.648493+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:27.648685+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 26451968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:28.648853+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 26451968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:29.648995+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:30.649216+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:31.649387+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:32.649548+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:33.649904+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2618800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618800 session 0x5616e312d0e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108b400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108b400 session 0x5616e312fc20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3220400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220400 session 0x5616e34725a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3284c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284c00 session 0x5616e312c780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 26427392 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d4800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.763683319s of 20.829357147s, submitted: 19
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4800 session 0x5616e3477c20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:34.650059+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e11f9c00 session 0x5616e05732c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108b400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119709696 unmapped: 26140672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:35.650385+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1324070 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:36.650518+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:37.679962+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:38.680196+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d7000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d7000 session 0x5616e04dc960
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:39.680352+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d5000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327a400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120020992 unmapped: 25829376 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:40.680569+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 26042368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1330608 data_alloc: 234881024 data_used: 11010048
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:41.680715+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:42.680886+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:43.681072+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:44.681219+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:45.681379+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356296 data_alloc: 234881024 data_used: 14831616
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:46.681577+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:47.681732+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:48.681859+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:49.682006+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 24354816 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:50.682170+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.424659729s of 16.518987656s, submitted: 14
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3281400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3281400 session 0x5616e0d6ed20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123994112 unmapped: 21856256 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453094 data_alloc: 234881024 data_used: 15106048
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:51.682319+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124026880 unmapped: 21823488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:52.682479+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87f4000/0x0/0x4ffc00000, data 0x29abac2/0x2a66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:53.682705+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:54.682884+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:55.683075+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1459086 data_alloc: 234881024 data_used: 15007744
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:56.683249+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x29b3ac2/0x2a6e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0540c00 session 0x5616e0cdb0e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327cc00 session 0x5616e278a780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:57.683446+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:58.683617+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 22183936 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2616400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2616400 session 0x5616e36910e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:59.683855+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0540c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327cc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123813888 unmapped: 22036480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:00.684020+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466042 data_alloc: 234881024 data_used: 16191488
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87c7000/0x0/0x4ffc00000, data 0x29daac2/0x2a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:01.684270+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:02.684474+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:03.684682+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:04.684919+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:05.685100+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466042 data_alloc: 234881024 data_used: 16191488
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:06.685276+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87c7000/0x0/0x4ffc00000, data 0x29daac2/0x2a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:07.685459+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:08.685635+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:09.685769+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:10.685918+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.643545151s of 19.996114731s, submitted: 90
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124805120 unmapped: 21045248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1548442 data_alloc: 234881024 data_used: 16363520
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:11.686083+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124837888 unmapped: 21012480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:12.686241+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7d82000/0x0/0x4ffc00000, data 0x3417ac2/0x34d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 21381120 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:13.686448+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2cac000 session 0x5616e1f04b40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542800 session 0x5616e1ec81e0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:14.686712+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:15.686923+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555186 data_alloc: 234881024 data_used: 16371712
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:16.687059+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:17.687249+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123117568 unmapped: 22732800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7d01000/0x0/0x4ffc00000, data 0x34a0ac2/0x355b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:18.687379+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:19.687537+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:20.687676+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555554 data_alloc: 234881024 data_used: 16371712
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.561103821s of 10.734168053s, submitted: 73
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:21.687830+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:22.687977+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:23.688200+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:24.688321+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3222c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:25.688441+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555386 data_alloc: 234881024 data_used: 16371712
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:26.688583+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:27.688701+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:28.688846+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:29.688965+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:30.689084+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555642 data_alloc: 234881024 data_used: 16371712
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:31.689267+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d6800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.766269684s of 10.206396103s, submitted: 5
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0540c00 session 0x5616e312c3c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327cc00 session 0x5616e34734a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321b000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:32.689391+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321b000 session 0x5616e0cdb680
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:33.689594+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:34.689746+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:35.689898+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5000 session 0x5616e04dde00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327a400 session 0x5616e1f04780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1439208 data_alloc: 234881024 data_used: 15007744
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:36.690011+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2618c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618c00 session 0x5616e311b860
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:37.690245+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:38.690414+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:39.690551+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:40.690661+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:41.690853+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:42.691134+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:43.691404+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:44.691542+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:45.691689+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:46.691844+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:47.691988+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: mgrc ms_handle_reset ms_handle_reset con 0x5616dff2cc00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1282799344
Feb 02 10:14:24 compute-1 ceph-osd[77691]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1282799344,v1:192.168.122.100:6801/1282799344]
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: get_auth_request con 0x5616e2616400 auth_method 0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: mgrc handle_mgr_configure stats_period=5
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:48.692200+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:49.692326+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:50.692461+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:51.692661+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:52.692814+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:53.693083+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:54.693248+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:55.693381+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:56.693564+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:57.693726+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:58.693939+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:59.694186+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:00.694370+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:01.694527+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:02.694668+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:03.694855+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:04.695023+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:05.695228+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d7c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d7c00 session 0x5616e312cd20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226a400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226a400 session 0x5616e1f04d20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2618c00
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618c00 session 0x5616dfb70b40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327a400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327a400 session 0x5616e1f7a780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d5000
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.408138275s of 34.488780975s, submitted: 26
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5000 session 0x5616e34eba40
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:06.695400+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325281 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:07.695545+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:08.695687+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:09.695905+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:10.696080+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:11.696249+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325281 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 24403968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:12.696387+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327f400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f400 session 0x5616e312c780
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e5800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c400
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 24395776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:13.696640+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:14.696778+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:15.696983+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:16.697180+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337509 data_alloc: 234881024 data_used: 11628544
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:17.697303+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:18.697574+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:19.697725+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:20.697882+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:21.698034+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337509 data_alloc: 234881024 data_used: 11628544
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:22.698207+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:23.698368+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.755592346s of 17.786962509s, submitted: 13
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,0,0,1])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:24.698505+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 20889600 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:25.698668+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:26.698850+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1441767 data_alloc: 234881024 data_used: 13156352
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a1a000/0x0/0x4ffc00000, data 0x2786ac2/0x2841000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:27.698986+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:28.699139+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:29.699305+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:30.699499+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:31.699717+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1434471 data_alloc: 234881024 data_used: 13156352
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a18000/0x0/0x4ffc00000, data 0x2789ac2/0x2844000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:32.699889+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:33.700055+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:34.700221+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5800 session 0x5616e312c3c0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c400 session 0x5616e312cd20
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e5800
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.164107323s of 11.421627998s, submitted: 102
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:35.700356+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 23257088 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5800 session 0x5616e359a5a0
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:36.700524+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:37.700724+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:38.700880+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:39.701074+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:40.701262+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:41.701385+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:42.701514+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:43.701685+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:44.701896+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:45.702081+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:46.702238+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:47.702354+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:48.702481+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:49.702666+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:50.702809+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:51.703007+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:52.703187+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:53.703392+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:54.703576+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:55.703725+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:56.703910+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:57.704083+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:58.704268+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:59.704420+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:00.704537+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:01.704705+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:02.704825+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:03.704998+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:04.705213+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:05.705411+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:06.705586+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:07.705767+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:08.705937+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:09.706065+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:10.706191+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:11.706376+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:12.706541+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:13.706763+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:14.706957+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:15.707119+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:16.707271+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:17.707437+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:18.707640+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:19.707802+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:20.707947+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:21.708058+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:22.708215+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:23.708431+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:24.708635+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:25.708788+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:26.708906+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:27.709064+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:28.709213+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:29.709372+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:30.709470+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:31.709689+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:32.709899+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:33.710080+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:34.710243+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:35.710400+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:36.710568+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:37.710700+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:38.710846+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:39.710979+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:40.711102+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:41.711256+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:42.711421+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:43.711605+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:44.711792+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:45.711899+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:46.712111+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:47.712250+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:48.712363+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:49.712592+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:50.712716+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:51.712858+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}'
Feb 02 10:14:24 compute-1 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb 02 10:14:24 compute-1 ceph-osd[77691]: do_command 'config show' '{prefix=config show}'
Feb 02 10:14:24 compute-1 ceph-osd[77691]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:14:24 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:14:24 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:14:24 compute-1 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}'
Feb 02 10:14:24 compute-1 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb 02 10:14:24 compute-1 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}'
Feb 02 10:14:24 compute-1 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb 02 10:14:24 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:52.713120+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 24305664 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:14:24 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:53.713311+0000)
Feb 02 10:14:24 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:14:24 compute-1 ceph-osd[77691]: do_command 'log dump' '{prefix=log dump}'
Feb 02 10:14:24 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 02 10:14:24 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3594690318' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:14:24 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 10:14:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 02 10:14:25 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/408226540' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.26858 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.17169 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.26893 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.26882 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/933138151' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2678781036' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2747034242' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.26926 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: pgmap v1050: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3594690318' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.26897 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1469722291' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1332516381' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2741009425' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/390320203' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/408226540' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:14:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 02 10:14:25 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/223053324' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:14:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:25.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:25.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:25 compute-1 crontab[239564]: (root) LIST (root)
Feb 02 10:14:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Feb 02 10:14:26 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3137409554' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.26953 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.26912 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.17241 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1525163689' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.26974 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.26936 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4089552539' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/223053324' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.17265 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.26995 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3508290189' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2224683646' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3137409554' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb 02 10:14:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3141533306' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:14:26 compute-1 nova_compute[226294]: 2026-02-02 10:14:26.801 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:26 compute-1 nova_compute[226294]: 2026-02-02 10:14:26.803 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Feb 02 10:14:27 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4278920637' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Feb 02 10:14:27 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1157535423' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.26954 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.17289 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.27022 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.26972 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.17310 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: pgmap v1051: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.27043 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.26993 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4025381601' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.17331 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3277569557' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2467961628' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3146719867' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4278920637' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1157535423' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb 02 10:14:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3865929111' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb 02 10:14:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:27.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:27.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Feb 02 10:14:27 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4104614483' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Feb 02 10:14:28 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2970138867' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb 02 10:14:28 compute-1 systemd[1]: Starting Hostname Service...
Feb 02 10:14:28 compute-1 systemd[1]: Started Hostname Service.
Feb 02 10:14:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Feb 02 10:14:28 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3330418646' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Feb 02 10:14:28 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/482955478' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.27061 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.27002 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.17355 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.17370 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.27085 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.17382 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1933784703' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/4104614483' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2046070083' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2341655505' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/865809388' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2970138867' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3590236176' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3330418646' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/272202263' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/482955478' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2398363290' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Feb 02 10:14:28 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1038496448' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb 02 10:14:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Feb 02 10:14:28 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3279308775' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Feb 02 10:14:29 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1846634575' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Feb 02 10:14:29 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3696275265' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Feb 02 10:14:29 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2233140680' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:14:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:29.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:14:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:29.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.17403 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.17424 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: pgmap v1052: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.17436 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2384577425' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1038496448' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3711047924' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3279308775' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4102532455' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1968180429' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3889890898' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1846634575' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3696275265' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2900426886' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2774745979' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1621805538' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2233140680' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2515393203' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb 02 10:14:29 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 02 10:14:29 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2314626929' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Feb 02 10:14:30 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2142204933' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Feb 02 10:14:30 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3681611476' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Feb 02 10:14:30 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/757916292' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.17463 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2314626929' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2109358869' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1793153749' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2781432572' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2142204933' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4228190723' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3681611476' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3399529638' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/757916292' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3271083416' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1230110462' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb 02 10:14:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1285535699' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb 02 10:14:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:14:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:31.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:14:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:14:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:31.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.27235 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.27209 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: pgmap v1053: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.27259 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.27271 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2715461821' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2712029861' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1705546128' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2670796580' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1047381592' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1366452837' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb 02 10:14:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3898423268' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:31 compute-1 nova_compute[226294]: 2026-02-02 10:14:31.804 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:14:31 compute-1 nova_compute[226294]: 2026-02-02 10:14:31.806 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:31 compute-1 nova_compute[226294]: 2026-02-02 10:14:31.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:14:31 compute-1 nova_compute[226294]: 2026-02-02 10:14:31.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:14:31 compute-1 nova_compute[226294]: 2026-02-02 10:14:31.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:14:31 compute-1 nova_compute[226294]: 2026-02-02 10:14:31.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Feb 02 10:14:32 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2796014380' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Feb 02 10:14:32 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3216905718' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.27218 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.27230 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.27245 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.27295 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.27272 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.27319 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1562702619' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4182919026' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/894840890' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2796014380' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb 02 10:14:32 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2484528277' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Feb 02 10:14:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1519222063' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:33.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:33.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.27290 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.27337 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.17655 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.27314 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.27358 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.17673 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: pgmap v1054: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.17667 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.27338 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3216905718' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.27370 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2024779558' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1519222063' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4264917689' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:34 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Feb 02 10:14:34 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3425981063' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.17682 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.27359 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.27397 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.17706 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.17754 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3744122681' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2146426638' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3693922191' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/410682327' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:34 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3425981063' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Feb 02 10:14:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/499973376' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Feb 02 10:14:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3657339990' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Feb 02 10:14:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:35.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:35.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='client.17793 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='client.17817 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='client.27499 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: pgmap v1055: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='client.27479 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/422454412' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/499973376' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4222841602' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3657339990' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:14:35 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:14:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Feb 02 10:14:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/923413145' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Feb 02 10:14:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Feb 02 10:14:36 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/610912574' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Feb 02 10:14:36 compute-1 nova_compute[226294]: 2026-02-02 10:14:36.805 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:36 compute-1 nova_compute[226294]: 2026-02-02 10:14:36.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:36 compute-1 ceph-mon[80115]: from='client.17835 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4019379931' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Feb 02 10:14:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/923413145' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Feb 02 10:14:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3529115186' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Feb 02 10:14:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/295696509' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Feb 02 10:14:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/610912574' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Feb 02 10:14:36 compute-1 ceph-mon[80115]: from='client.17922 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:36 compute-1 ceph-mon[80115]: pgmap v1056: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:36 compute-1 ceph-mon[80115]: from='client.27586 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:36 compute-1 ceph-mon[80115]: from='client.27563 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Feb 02 10:14:37 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/891695340' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Feb 02 10:14:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:37.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:37.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2290487711' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Feb 02 10:14:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3273556702' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Feb 02 10:14:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/891695340' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Feb 02 10:14:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2241760720' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Feb 02 10:14:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4239155458' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Feb 02 10:14:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1248462513' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Feb 02 10:14:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3220927735' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Feb 02 10:14:38 compute-1 ovs-appctl[241966]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 02 10:14:38 compute-1 ovs-appctl[241972]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 02 10:14:38 compute-1 ovs-appctl[241982]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 02 10:14:38 compute-1 ceph-mon[80115]: from='client.27622 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:38 compute-1 ceph-mon[80115]: from='client.27625 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1983266021' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Feb 02 10:14:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/612674027' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Feb 02 10:14:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/477489850' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Feb 02 10:14:38 compute-1 ceph-mon[80115]: pgmap v1057: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:38 compute-1 ceph-mon[80115]: from='client.17970 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:38 compute-1 ceph-mon[80115]: from='client.27649 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:39 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Feb 02 10:14:39 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/996126505' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Feb 02 10:14:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:39.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:39.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:39 compute-1 ceph-mon[80115]: from='client.27626 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:39 compute-1 ceph-mon[80115]: from='client.27661 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1424509917' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Feb 02 10:14:39 compute-1 ceph-mon[80115]: from='client.27641 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4066344091' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Feb 02 10:14:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3278998420' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Feb 02 10:14:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/996126505' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Feb 02 10:14:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3175540847' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Feb 02 10:14:39 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Feb 02 10:14:39 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1974253353' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Feb 02 10:14:40 compute-1 podman[242880]: 2026-02-02 10:14:40.389120722 +0000 UTC m=+0.066911557 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 02 10:14:41 compute-1 sudo[243294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:14:41 compute-1 sudo[243294]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:14:41 compute-1 sudo[243294]: pam_unix(sudo:session): session closed for user root
Feb 02 10:14:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Feb 02 10:14:41 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4142895348' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Feb 02 10:14:41 compute-1 ceph-mon[80115]: from='client.18009 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1974253353' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Feb 02 10:14:41 compute-1 ceph-mon[80115]: from='client.27677 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:41 compute-1 ceph-mon[80115]: from='client.27683 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3707707503' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Feb 02 10:14:41 compute-1 ceph-mon[80115]: from='client.27697 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:41 compute-1 ceph-mon[80115]: pgmap v1058: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:41 compute-1 ceph-mon[80115]: from='client.27695 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:41 compute-1 ceph-mon[80115]: from='client.18036 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1427426924' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Feb 02 10:14:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:41.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:41.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:41 compute-1 nova_compute[226294]: 2026-02-02 10:14:41.806 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:41 compute-1 nova_compute[226294]: 2026-02-02 10:14:41.809 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Feb 02 10:14:41 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1782890941' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Feb 02 10:14:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:42 compute-1 ceph-mon[80115]: from='client.18045 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1480509594' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Feb 02 10:14:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2058255725' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Feb 02 10:14:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/4142895348' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Feb 02 10:14:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/752273424' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Feb 02 10:14:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1782890941' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Feb 02 10:14:43 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Feb 02 10:14:43 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2681020612' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:14:43 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Feb 02 10:14:43 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2649633197' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Feb 02 10:14:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:43.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:43.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:43 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Feb 02 10:14:43 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/342004016' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.27724 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.18069 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.27740 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.18075 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: pgmap v1059: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.18081 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.27752 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2819146735' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4009326478' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2681020612' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3720502581' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2821609970' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Feb 02 10:14:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2649633197' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Feb 02 10:14:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:14:44.914 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:14:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:14:44.914 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:14:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:14:44.914 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:14:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Feb 02 10:14:45 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/472295889' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:45 compute-1 ceph-mon[80115]: from='client.18123 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2199196883' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/342004016' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:45 compute-1 ceph-mon[80115]: from='client.18132 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:14:45 compute-1 ceph-mon[80115]: from='client.27778 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:45 compute-1 ceph-mon[80115]: from='client.27797 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:45 compute-1 ceph-mon[80115]: pgmap v1060: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3357825108' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:14:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2601720369' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Feb 02 10:14:45 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3468868603' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Feb 02 10:14:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:45.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:45.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Feb 02 10:14:45 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3233536469' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Feb 02 10:14:46 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1057507925' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/472295889' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1863452279' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3001358093' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3468868603' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1163746050' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3233536469' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1925975289' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/53969991' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1057507925' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:46 compute-1 nova_compute[226294]: 2026-02-02 10:14:46.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:46 compute-1 nova_compute[226294]: 2026-02-02 10:14:46.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Feb 02 10:14:47 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/603170672' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:47 compute-1 ceph-mon[80115]: from='client.18186 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:47 compute-1 ceph-mon[80115]: pgmap v1061: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:47 compute-1 ceph-mon[80115]: from='client.27823 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:47 compute-1 ceph-mon[80115]: from='client.27854 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1423557611' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1094621728' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/603170672' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:14:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2291812909' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Feb 02 10:14:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Feb 02 10:14:47 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2864374764' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:47.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:47.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:48 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Feb 02 10:14:48 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3240901568' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:48 compute-1 podman[243897]: 2026-02-02 10:14:48.459874682 +0000 UTC m=+0.048589707 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 02 10:14:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1136564812' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2864374764' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2718334318' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2919304723' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2431639728' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3240901568' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:48 compute-1 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 02 10:14:49 compute-1 systemd[1]: Starting Time & Date Service...
Feb 02 10:14:49 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Feb 02 10:14:49 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2824407657' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:49 compute-1 systemd[1]: Started Time & Date Service.
Feb 02 10:14:49 compute-1 ceph-mon[80115]: from='client.27859 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:49 compute-1 ceph-mon[80115]: from='client.27887 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:49 compute-1 ceph-mon[80115]: pgmap v1062: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:49 compute-1 ceph-mon[80115]: from='client.18237 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:49 compute-1 ceph-mon[80115]: from='client.27880 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:49 compute-1 ceph-mon[80115]: from='client.27908 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3905466608' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2366253068' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2824407657' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:49.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:49.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:50 compute-1 ceph-mon[80115]: from='client.27892 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:50 compute-1 ceph-mon[80115]: from='client.27923 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/806713623' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3576855869' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2696044831' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3280711553' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Feb 02 10:14:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Feb 02 10:14:50 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1527859638' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Feb 02 10:14:51 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/218288169' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.18285 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.27928 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.27931 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.27937 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.27956 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: pgmap v1063: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3994556235' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.18309 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1527859638' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1138997752' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/218288169' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:51.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:51.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:51 compute-1 nova_compute[226294]: 2026-02-02 10:14:51.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:51 compute-1 nova_compute[226294]: 2026-02-02 10:14:51.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Feb 02 10:14:52 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3301089356' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:52 compute-1 ceph-mon[80115]: from='client.27965 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:52 compute-1 ceph-mon[80115]: from='client.27964 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:52 compute-1 ceph-mon[80115]: from='client.27980 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2449344092' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Feb 02 10:14:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3134255602' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3301089356' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3010295768' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Feb 02 10:14:52 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2848879131' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:53 compute-1 ceph-mon[80115]: from='client.27986 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:53 compute-1 ceph-mon[80115]: from='client.27973 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:53 compute-1 ceph-mon[80115]: from='client.18357 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:53 compute-1 ceph-mon[80115]: pgmap v1064: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:53 compute-1 ceph-mon[80115]: from='client.18366 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2848879131' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/883416121' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1316092978' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:14:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/159882353' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Feb 02 10:14:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:53.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:53.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:55 compute-1 ceph-mon[80115]: from='client.18393 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:55 compute-1 ceph-mon[80115]: from='client.18399 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:14:55 compute-1 ceph-mon[80115]: pgmap v1065: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3895016293' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3732477402' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:14:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3732477402' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:14:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2205610481' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Feb 02 10:14:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:55.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:55.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:56 compute-1 nova_compute[226294]: 2026-02-02 10:14:56.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:14:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:14:57 compute-1 ceph-mon[80115]: pgmap v1066: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:14:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:57.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:14:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:57.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:14:59 compute-1 ceph-mon[80115]: pgmap v1067: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:14:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:14:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:14:59.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:14:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:14:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:14:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:14:59.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:01 compute-1 sudo[244601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:15:01 compute-1 sudo[244601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:15:01 compute-1 sudo[244601]: pam_unix(sudo:session): session closed for user root
Feb 02 10:15:01 compute-1 ceph-mon[80115]: pgmap v1068: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:01.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:01.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:01 compute-1 nova_compute[226294]: 2026-02-02 10:15:01.812 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:15:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:03.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:03 compute-1 ceph-mon[80115]: pgmap v1069: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:03.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:05.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:05.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:05 compute-1 ceph-mon[80115]: pgmap v1070: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:15:06 compute-1 nova_compute[226294]: 2026-02-02 10:15:06.814 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:07.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:07.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:07 compute-1 ceph-mon[80115]: pgmap v1071: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:09.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:09.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:09 compute-1 ceph-mon[80115]: pgmap v1072: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:15:10 compute-1 ceph-mon[80115]: pgmap v1073: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:11 compute-1 podman[244631]: 2026-02-02 10:15:11.447948966 +0000 UTC m=+0.125427379 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 02 10:15:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:11.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:11.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:11 compute-1 nova_compute[226294]: 2026-02-02 10:15:11.816 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:12 compute-1 sshd-session[244656]: Invalid user solv from 80.94.92.184 port 59426
Feb 02 10:15:12 compute-1 sshd-session[244656]: Connection closed by invalid user solv 80.94.92.184 port 59426 [preauth]
Feb 02 10:15:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:13.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:14.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:14 compute-1 ceph-mon[80115]: pgmap v1074: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4029781014' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:15:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3992078019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:15:14 compute-1 nova_compute[226294]: 2026-02-02 10:15:14.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:15:15 compute-1 ceph-mon[80115]: pgmap v1075: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:15:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:15.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:16.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:16 compute-1 sudo[244663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:15:16 compute-1 sudo[244663]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:15:16 compute-1 sudo[244663]: pam_unix(sudo:session): session closed for user root
Feb 02 10:15:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3460437424' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:15:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3814130275' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:15:16 compute-1 sudo[244688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:15:16 compute-1 sudo[244688]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:15:16 compute-1 nova_compute[226294]: 2026-02-02 10:15:16.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:15:16 compute-1 nova_compute[226294]: 2026-02-02 10:15:16.651 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:15:16 compute-1 nova_compute[226294]: 2026-02-02 10:15:16.651 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:15:16 compute-1 nova_compute[226294]: 2026-02-02 10:15:16.676 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:15:16 compute-1 nova_compute[226294]: 2026-02-02 10:15:16.676 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:15:16 compute-1 nova_compute[226294]: 2026-02-02 10:15:16.677 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:15:16 compute-1 sudo[244688]: pam_unix(sudo:session): session closed for user root
Feb 02 10:15:16 compute-1 nova_compute[226294]: 2026-02-02 10:15:16.816 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:16 compute-1 nova_compute[226294]: 2026-02-02 10:15:16.818 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:17 compute-1 ceph-mon[80115]: pgmap v1076: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:15:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:15:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:15:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:15:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:15:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:15:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:15:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:15:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:17.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:18.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:18 compute-1 ceph-mon[80115]: pgmap v1077: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 0 op/s
Feb 02 10:15:18 compute-1 nova_compute[226294]: 2026-02-02 10:15:18.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:15:19 compute-1 podman[244748]: 2026-02-02 10:15:19.417171886 +0000 UTC m=+0.081618620 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 02 10:15:19 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 02 10:15:19 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 02 10:15:19 compute-1 nova_compute[226294]: 2026-02-02 10:15:19.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:15:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:19.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:20.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:20 compute-1 ceph-mon[80115]: pgmap v1078: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 0 op/s
Feb 02 10:15:21 compute-1 sudo[244774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:15:21 compute-1 sudo[244774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:15:21 compute-1 sudo[244774]: pam_unix(sudo:session): session closed for user root
Feb 02 10:15:21 compute-1 nova_compute[226294]: 2026-02-02 10:15:21.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:15:21 compute-1 nova_compute[226294]: 2026-02-02 10:15:21.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:15:21 compute-1 nova_compute[226294]: 2026-02-02 10:15:21.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:15:21 compute-1 nova_compute[226294]: 2026-02-02 10:15:21.682 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:15:21 compute-1 nova_compute[226294]: 2026-02-02 10:15:21.683 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:15:21 compute-1 nova_compute[226294]: 2026-02-02 10:15:21.683 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:15:21 compute-1 nova_compute[226294]: 2026-02-02 10:15:21.684 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:15:21 compute-1 nova_compute[226294]: 2026-02-02 10:15:21.684 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:15:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:21.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:21 compute-1 sudo[244800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:15:21 compute-1 sudo[244800]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:15:21 compute-1 sudo[244800]: pam_unix(sudo:session): session closed for user root
Feb 02 10:15:21 compute-1 nova_compute[226294]: 2026-02-02 10:15:21.818 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:22.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:15:22 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/958922429' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.135 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.289 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.290 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4686MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.290 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.291 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.361 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.362 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.389 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:15:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:22 compute-1 ceph-mon[80115]: pgmap v1079: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 0 op/s
Feb 02 10:15:22 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:15:22 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:15:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/958922429' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:15:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:15:22 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/7819906' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.863 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.867 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.882 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.883 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:15:22 compute-1 nova_compute[226294]: 2026-02-02 10:15:22.884 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:15:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/7819906' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:15:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:23.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:23 compute-1 nova_compute[226294]: 2026-02-02 10:15:23.884 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:15:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:24.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:24 compute-1 ceph-mon[80115]: pgmap v1080: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 0 op/s
Feb 02 10:15:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:25.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:26.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:26 compute-1 ceph-mon[80115]: pgmap v1081: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 0 op/s
Feb 02 10:15:26 compute-1 nova_compute[226294]: 2026-02-02 10:15:26.821 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:15:26 compute-1 nova_compute[226294]: 2026-02-02 10:15:26.822 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:26 compute-1 nova_compute[226294]: 2026-02-02 10:15:26.822 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:15:26 compute-1 nova_compute[226294]: 2026-02-02 10:15:26.822 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:26 compute-1 nova_compute[226294]: 2026-02-02 10:15:26.823 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:26 compute-1 nova_compute[226294]: 2026-02-02 10:15:26.824 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:27.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:27 compute-1 ceph-mon[80115]: pgmap v1082: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 0 op/s
Feb 02 10:15:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:28.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:29.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:29 compute-1 ceph-mon[80115]: pgmap v1083: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:15:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:30.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:31.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:31 compute-1 nova_compute[226294]: 2026-02-02 10:15:31.824 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:15:32 compute-1 ceph-mon[80115]: pgmap v1084: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:32.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:15:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:33.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:34.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:34 compute-1 ceph-mon[80115]: pgmap v1085: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:35.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:36.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:36 compute-1 ceph-mon[80115]: pgmap v1086: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:15:36 compute-1 nova_compute[226294]: 2026-02-02 10:15:36.827 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:15:36 compute-1 nova_compute[226294]: 2026-02-02 10:15:36.828 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:15:36 compute-1 nova_compute[226294]: 2026-02-02 10:15:36.828 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:15:36 compute-1 nova_compute[226294]: 2026-02-02 10:15:36.829 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:36 compute-1 nova_compute[226294]: 2026-02-02 10:15:36.878 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:36 compute-1 nova_compute[226294]: 2026-02-02 10:15:36.879 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:37.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:38.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:38 compute-1 ceph-mon[80115]: pgmap v1087: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:39 compute-1 sudo[237043]: pam_unix(sudo:session): session closed for user root
Feb 02 10:15:39 compute-1 sshd-session[237029]: Received disconnect from 192.168.122.10 port 37774:11: disconnected by user
Feb 02 10:15:39 compute-1 sshd-session[237029]: Disconnected from user zuul 192.168.122.10 port 37774
Feb 02 10:15:39 compute-1 sshd-session[237001]: pam_unix(sshd:session): session closed for user zuul
Feb 02 10:15:39 compute-1 systemd-logind[805]: Session 55 logged out. Waiting for processes to exit.
Feb 02 10:15:39 compute-1 systemd[1]: session-55.scope: Deactivated successfully.
Feb 02 10:15:39 compute-1 systemd[1]: session-55.scope: Consumed 2min 36.039s CPU time, 765.6M memory peak, read 304.5M from disk, written 211.4M to disk.
Feb 02 10:15:39 compute-1 systemd-logind[805]: Removed session 55.
Feb 02 10:15:39 compute-1 sshd-session[244877]: Accepted publickey for zuul from 192.168.122.10 port 48440 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 10:15:39 compute-1 systemd-logind[805]: New session 56 of user zuul.
Feb 02 10:15:39 compute-1 systemd[1]: Started Session 56 of User zuul.
Feb 02 10:15:39 compute-1 sshd-session[244877]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 10:15:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:39.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:39 compute-1 sudo[244881]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2026-02-02-dxtofop.tar.xz
Feb 02 10:15:39 compute-1 sudo[244881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 10:15:39 compute-1 sudo[244881]: pam_unix(sudo:session): session closed for user root
Feb 02 10:15:39 compute-1 sshd-session[244880]: Received disconnect from 192.168.122.10 port 48440:11: disconnected by user
Feb 02 10:15:39 compute-1 sshd-session[244880]: Disconnected from user zuul 192.168.122.10 port 48440
Feb 02 10:15:39 compute-1 sshd-session[244877]: pam_unix(sshd:session): session closed for user zuul
Feb 02 10:15:39 compute-1 systemd[1]: session-56.scope: Deactivated successfully.
Feb 02 10:15:39 compute-1 systemd-logind[805]: Session 56 logged out. Waiting for processes to exit.
Feb 02 10:15:39 compute-1 systemd-logind[805]: Removed session 56.
Feb 02 10:15:39 compute-1 sshd-session[244906]: Accepted publickey for zuul from 192.168.122.10 port 48450 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 10:15:39 compute-1 systemd-logind[805]: New session 57 of user zuul.
Feb 02 10:15:40 compute-1 systemd[1]: Started Session 57 of User zuul.
Feb 02 10:15:40 compute-1 sshd-session[244906]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 10:15:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:40.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:40 compute-1 sudo[244910]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Feb 02 10:15:40 compute-1 sudo[244910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 10:15:40 compute-1 sudo[244910]: pam_unix(sudo:session): session closed for user root
Feb 02 10:15:40 compute-1 sshd-session[244909]: Received disconnect from 192.168.122.10 port 48450:11: disconnected by user
Feb 02 10:15:40 compute-1 sshd-session[244909]: Disconnected from user zuul 192.168.122.10 port 48450
Feb 02 10:15:40 compute-1 sshd-session[244906]: pam_unix(sshd:session): session closed for user zuul
Feb 02 10:15:40 compute-1 systemd[1]: session-57.scope: Deactivated successfully.
Feb 02 10:15:40 compute-1 systemd-logind[805]: Session 57 logged out. Waiting for processes to exit.
Feb 02 10:15:40 compute-1 systemd-logind[805]: Removed session 57.
Feb 02 10:15:40 compute-1 ceph-mon[80115]: pgmap v1088: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:15:41 compute-1 sudo[244937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:15:41 compute-1 sudo[244937]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:15:41 compute-1 sudo[244937]: pam_unix(sudo:session): session closed for user root
Feb 02 10:15:41 compute-1 podman[244961]: 2026-02-02 10:15:41.706040898 +0000 UTC m=+0.087514487 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 02 10:15:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:41.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:41 compute-1 nova_compute[226294]: 2026-02-02 10:15:41.880 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:15:41 compute-1 nova_compute[226294]: 2026-02-02 10:15:41.882 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:15:41 compute-1 nova_compute[226294]: 2026-02-02 10:15:41.882 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:15:41 compute-1 nova_compute[226294]: 2026-02-02 10:15:41.882 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:41 compute-1 nova_compute[226294]: 2026-02-02 10:15:41.936 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:41 compute-1 nova_compute[226294]: 2026-02-02 10:15:41.937 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:42.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:42 compute-1 ceph-mon[80115]: pgmap v1089: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:43.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:44.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:44 compute-1 ceph-mon[80115]: pgmap v1090: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:15:44.915 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:15:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:15:44.916 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:15:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:15:44.916 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:15:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:45.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:46.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:46 compute-1 ceph-mon[80115]: pgmap v1091: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:15:46 compute-1 nova_compute[226294]: 2026-02-02 10:15:46.938 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:15:46 compute-1 nova_compute[226294]: 2026-02-02 10:15:46.940 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:15:46 compute-1 nova_compute[226294]: 2026-02-02 10:15:46.941 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:15:46 compute-1 nova_compute[226294]: 2026-02-02 10:15:46.941 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:46 compute-1 nova_compute[226294]: 2026-02-02 10:15:46.951 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:46 compute-1 nova_compute[226294]: 2026-02-02 10:15:46.953 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:15:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:47.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:48.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:48 compute-1 ceph-mon[80115]: pgmap v1092: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:49.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:50.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:50 compute-1 podman[244991]: 2026-02-02 10:15:50.395131681 +0000 UTC m=+0.064849083 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 02 10:15:50 compute-1 ceph-mon[80115]: pgmap v1093: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:15:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:51.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:51 compute-1 nova_compute[226294]: 2026-02-02 10:15:51.953 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:51 compute-1 nova_compute[226294]: 2026-02-02 10:15:51.956 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:15:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:52.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:15:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:52 compute-1 ceph-mon[80115]: pgmap v1094: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.797206) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352797248, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2724, "num_deletes": 251, "total_data_size": 6598238, "memory_usage": 6685136, "flush_reason": "Manual Compaction"}
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352846441, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4262073, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31255, "largest_seqno": 33974, "table_properties": {"data_size": 4250121, "index_size": 7486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 29962, "raw_average_key_size": 22, "raw_value_size": 4224550, "raw_average_value_size": 3138, "num_data_blocks": 319, "num_entries": 1346, "num_filter_entries": 1346, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027157, "oldest_key_time": 1770027157, "file_creation_time": 1770027352, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 49278 microseconds, and 10765 cpu microseconds.
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.846487) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4262073 bytes OK
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.846503) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.850197) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.850246) EVENT_LOG_v1 {"time_micros": 1770027352850236, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.850271) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6585171, prev total WAL file size 6585171, number of live WAL files 2.
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.851236) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4162KB)], [60(11MB)]
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352851273, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 15907516, "oldest_snapshot_seqno": -1}
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6611 keys, 13905949 bytes, temperature: kUnknown
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352977006, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 13905949, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13862827, "index_size": 25474, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 169843, "raw_average_key_size": 25, "raw_value_size": 13745104, "raw_average_value_size": 2079, "num_data_blocks": 1023, "num_entries": 6611, "num_filter_entries": 6611, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027352, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.977328) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 13905949 bytes
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.984192) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.4 rd, 110.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 11.1 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(7.0) write-amplify(3.3) OK, records in: 7132, records dropped: 521 output_compression: NoCompression
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.984227) EVENT_LOG_v1 {"time_micros": 1770027352984212, "job": 36, "event": "compaction_finished", "compaction_time_micros": 125820, "compaction_time_cpu_micros": 19360, "output_level": 6, "num_output_files": 1, "total_output_size": 13905949, "num_input_records": 7132, "num_output_records": 6611, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352985083, "job": 36, "event": "table_file_deletion", "file_number": 62}
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027352987114, "job": 36, "event": "table_file_deletion", "file_number": 60}
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.851095) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987368) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987383) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987386) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:15:52 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:15:52.987389) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:15:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:53.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:54.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:54 compute-1 ceph-mon[80115]: pgmap v1095: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:55.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2532913621' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:15:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2532913621' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:15:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:56.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:56 compute-1 ceph-mon[80115]: pgmap v1096: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:15:56 compute-1 nova_compute[226294]: 2026-02-02 10:15:56.958 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:15:56 compute-1 nova_compute[226294]: 2026-02-02 10:15:56.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:15:56 compute-1 nova_compute[226294]: 2026-02-02 10:15:56.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:15:56 compute-1 nova_compute[226294]: 2026-02-02 10:15:56.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:56 compute-1 nova_compute[226294]: 2026-02-02 10:15:56.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:15:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:15:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:57.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:57 compute-1 ceph-mon[80115]: pgmap v1097: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:15:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:15:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:15:58.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:15:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:15:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:15:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:15:59.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:15:59 compute-1 ceph-mon[80115]: pgmap v1098: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:16:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:00.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:01 compute-1 sudo[245016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:16:01 compute-1 sudo[245016]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:16:01 compute-1 sudo[245016]: pam_unix(sudo:session): session closed for user root
Feb 02 10:16:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:01.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:01 compute-1 nova_compute[226294]: 2026-02-02 10:16:01.960 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:01 compute-1 ceph-mon[80115]: pgmap v1099: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:02.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:16:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:03.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:04 compute-1 ceph-mon[80115]: pgmap v1100: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:04.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:05.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:06 compute-1 ceph-mon[80115]: pgmap v1101: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:16:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:06.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:06 compute-1 nova_compute[226294]: 2026-02-02 10:16:06.967 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:07.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:08 compute-1 ceph-mon[80115]: pgmap v1102: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:08.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:09.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:10.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:10 compute-1 ceph-mon[80115]: pgmap v1103: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:16:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:16:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:11.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:16:11 compute-1 nova_compute[226294]: 2026-02-02 10:16:11.966 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:11 compute-1 nova_compute[226294]: 2026-02-02 10:16:11.969 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:12.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:12 compute-1 ceph-mon[80115]: pgmap v1104: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:12 compute-1 podman[245046]: 2026-02-02 10:16:12.409922515 +0000 UTC m=+0.082931555 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:16:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:13.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:14.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:14 compute-1 ceph-mon[80115]: pgmap v1105: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2865253705' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:16:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3044454252' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:16:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3938040267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:16:15 compute-1 nova_compute[226294]: 2026-02-02 10:16:15.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:15.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:16.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:16 compute-1 ceph-mon[80115]: pgmap v1106: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:16:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1963613397' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:16:16 compute-1 nova_compute[226294]: 2026-02-02 10:16:16.651 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:16 compute-1 nova_compute[226294]: 2026-02-02 10:16:16.991 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:16 compute-1 nova_compute[226294]: 2026-02-02 10:16:16.992 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:16 compute-1 nova_compute[226294]: 2026-02-02 10:16:16.993 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:16:16 compute-1 nova_compute[226294]: 2026-02-02 10:16:16.993 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:16 compute-1 nova_compute[226294]: 2026-02-02 10:16:16.993 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:16 compute-1 nova_compute[226294]: 2026-02-02 10:16:16.995 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:16:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:17 compute-1 nova_compute[226294]: 2026-02-02 10:16:17.645 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:17.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:18.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:18 compute-1 ceph-mon[80115]: pgmap v1107: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:18 compute-1 nova_compute[226294]: 2026-02-02 10:16:18.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:18 compute-1 nova_compute[226294]: 2026-02-02 10:16:18.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:16:18 compute-1 nova_compute[226294]: 2026-02-02 10:16:18.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:16:18 compute-1 nova_compute[226294]: 2026-02-02 10:16:18.681 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:16:18 compute-1 nova_compute[226294]: 2026-02-02 10:16:18.681 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:19 compute-1 nova_compute[226294]: 2026-02-02 10:16:19.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:19 compute-1 nova_compute[226294]: 2026-02-02 10:16:19.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:19.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:20.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:20 compute-1 ceph-mon[80115]: pgmap v1108: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:16:21 compute-1 podman[245078]: 2026-02-02 10:16:21.400395155 +0000 UTC m=+0.081523047 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 02 10:16:21 compute-1 sudo[245097]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:16:21 compute-1 sudo[245097]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:16:21 compute-1 sudo[245097]: pam_unix(sudo:session): session closed for user root
Feb 02 10:16:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:21.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:21 compute-1 sudo[245122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:16:21 compute-1 sudo[245122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:16:21 compute-1 sudo[245122]: pam_unix(sudo:session): session closed for user root
Feb 02 10:16:21 compute-1 nova_compute[226294]: 2026-02-02 10:16:21.996 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:21 compute-1 nova_compute[226294]: 2026-02-02 10:16:21.998 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:21 compute-1 nova_compute[226294]: 2026-02-02 10:16:21.998 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:16:21 compute-1 nova_compute[226294]: 2026-02-02 10:16:21.998 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:22 compute-1 sudo[245147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:16:22 compute-1 nova_compute[226294]: 2026-02-02 10:16:22.033 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:22 compute-1 nova_compute[226294]: 2026-02-02 10:16:22.034 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:22 compute-1 sudo[245147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:16:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:22.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:22 compute-1 ceph-mon[80115]: pgmap v1109: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:22 compute-1 sudo[245147]: pam_unix(sudo:session): session closed for user root
Feb 02 10:16:22 compute-1 nova_compute[226294]: 2026-02-02 10:16:22.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:22 compute-1 nova_compute[226294]: 2026-02-02 10:16:22.676 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:16:22 compute-1 nova_compute[226294]: 2026-02-02 10:16:22.677 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:16:22 compute-1 nova_compute[226294]: 2026-02-02 10:16:22.677 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:16:22 compute-1 nova_compute[226294]: 2026-02-02 10:16:22.677 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:16:22 compute-1 nova_compute[226294]: 2026-02-02 10:16:22.678 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:16:23 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:16:23 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/613712461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.124 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.305 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.307 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4858MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.307 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.307 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.396 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.397 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.417 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:16:23 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/613712461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:16:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:23.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:23 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:16:23 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/216113752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.896 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.904 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.931 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.933 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:16:23 compute-1 nova_compute[226294]: 2026-02-02 10:16:23.933 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:16:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:24.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:24 compute-1 ceph-mon[80115]: pgmap v1110: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/216113752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:16:24 compute-1 nova_compute[226294]: 2026-02-02 10:16:24.935 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:24 compute-1 nova_compute[226294]: 2026-02-02 10:16:24.936 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:16:24 compute-1 nova_compute[226294]: 2026-02-02 10:16:24.936 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:16:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:25.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:26.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:26 compute-1 ceph-mon[80115]: pgmap v1111: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:16:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:16:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:16:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:16:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:16:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:16:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:16:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:16:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:16:26 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:16:27 compute-1 nova_compute[226294]: 2026-02-02 10:16:27.035 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:27 compute-1 nova_compute[226294]: 2026-02-02 10:16:27.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:27 compute-1 nova_compute[226294]: 2026-02-02 10:16:27.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:16:27 compute-1 nova_compute[226294]: 2026-02-02 10:16:27.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:27 compute-1 nova_compute[226294]: 2026-02-02 10:16:27.077 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:27 compute-1 nova_compute[226294]: 2026-02-02 10:16:27.078 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:27 compute-1 ceph-mon[80115]: pgmap v1112: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 556 B/s rd, 0 op/s
Feb 02 10:16:27 compute-1 ceph-mon[80115]: pgmap v1113: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 679 B/s rd, 0 op/s
Feb 02 10:16:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:16:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:27.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:16:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:28.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:29 compute-1 ceph-mon[80115]: pgmap v1114: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 339 B/s rd, 0 op/s
Feb 02 10:16:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:29.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:30.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:31 compute-1 sudo[245252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:16:31 compute-1 sudo[245252]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:16:31 compute-1 sudo[245252]: pam_unix(sudo:session): session closed for user root
Feb 02 10:16:31 compute-1 ceph-mon[80115]: pgmap v1115: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 679 B/s rd, 0 op/s
Feb 02 10:16:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:16:31 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:16:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:31.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:32 compute-1 nova_compute[226294]: 2026-02-02 10:16:32.078 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:32 compute-1 nova_compute[226294]: 2026-02-02 10:16:32.080 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:32.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:16:33 compute-1 ceph-mon[80115]: pgmap v1116: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 679 B/s rd, 0 op/s
Feb 02 10:16:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:33.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:34.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:35 compute-1 ceph-mon[80115]: pgmap v1117: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 679 B/s rd, 0 op/s
Feb 02 10:16:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:35.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:36.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:37 compute-1 nova_compute[226294]: 2026-02-02 10:16:37.081 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:37 compute-1 nova_compute[226294]: 2026-02-02 10:16:37.083 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:37 compute-1 nova_compute[226294]: 2026-02-02 10:16:37.083 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:16:37 compute-1 nova_compute[226294]: 2026-02-02 10:16:37.083 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:37 compute-1 nova_compute[226294]: 2026-02-02 10:16:37.120 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:37 compute-1 nova_compute[226294]: 2026-02-02 10:16:37.120 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:37 compute-1 ceph-mon[80115]: pgmap v1118: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Feb 02 10:16:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:37.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:16:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:38.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:16:39 compute-1 ceph-mon[80115]: pgmap v1119: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:39.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:40.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:41 compute-1 ceph-mon[80115]: pgmap v1120: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:16:41 compute-1 sudo[245282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:16:41 compute-1 sudo[245282]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:16:41 compute-1 sudo[245282]: pam_unix(sudo:session): session closed for user root
Feb 02 10:16:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:41.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:42 compute-1 nova_compute[226294]: 2026-02-02 10:16:42.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:42 compute-1 nova_compute[226294]: 2026-02-02 10:16:42.173 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:42 compute-1 nova_compute[226294]: 2026-02-02 10:16:42.173 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5053 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:16:42 compute-1 nova_compute[226294]: 2026-02-02 10:16:42.173 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:42 compute-1 nova_compute[226294]: 2026-02-02 10:16:42.174 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:42 compute-1 nova_compute[226294]: 2026-02-02 10:16:42.175 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:42.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:43 compute-1 podman[245309]: 2026-02-02 10:16:43.419556217 +0000 UTC m=+0.097810962 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 02 10:16:43 compute-1 ceph-mon[80115]: pgmap v1121: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:43.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:16:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:44.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:16:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:16:44.917 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:16:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:16:44.917 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:16:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:16:44.917 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:16:45 compute-1 ceph-mon[80115]: pgmap v1122: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:16:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:45.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:46.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:47 compute-1 nova_compute[226294]: 2026-02-02 10:16:47.175 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:47 compute-1 ceph-mon[80115]: pgmap v1123: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 510 B/s rd, 0 op/s
Feb 02 10:16:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:47.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:16:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:48.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:48 compute-1 ceph-mon[80115]: pgmap v1124: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 510 B/s rd, 0 op/s
Feb 02 10:16:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:49.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:50.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:51 compute-1 ceph-mon[80115]: pgmap v1125: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 765 B/s rd, 0 op/s
Feb 02 10:16:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:16:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:51.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:16:52 compute-1 nova_compute[226294]: 2026-02-02 10:16:52.178 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:52 compute-1 nova_compute[226294]: 2026-02-02 10:16:52.179 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:52 compute-1 nova_compute[226294]: 2026-02-02 10:16:52.179 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:16:52 compute-1 nova_compute[226294]: 2026-02-02 10:16:52.179 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:52.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:52 compute-1 nova_compute[226294]: 2026-02-02 10:16:52.212 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:52 compute-1 nova_compute[226294]: 2026-02-02 10:16:52.213 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:52 compute-1 podman[245339]: 2026-02-02 10:16:52.385393418 +0000 UTC m=+0.062600542 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 02 10:16:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:53 compute-1 ceph-mon[80115]: pgmap v1126: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 510 B/s rd, 0 op/s
Feb 02 10:16:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:16:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:53.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:16:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:16:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:54.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:16:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 02 10:16:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2626537737' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:16:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 02 10:16:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2626537737' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:16:55 compute-1 ceph-mon[80115]: pgmap v1127: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 765 B/s rd, 0 op/s
Feb 02 10:16:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2626537737' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:16:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2626537737' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:16:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:55.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:56.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:57 compute-1 ceph-mon[80115]: pgmap v1128: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 510 B/s rd, 0 op/s
Feb 02 10:16:57 compute-1 nova_compute[226294]: 2026-02-02 10:16:57.214 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:57 compute-1 nova_compute[226294]: 2026-02-02 10:16:57.216 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:16:57 compute-1 nova_compute[226294]: 2026-02-02 10:16:57.216 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:16:57 compute-1 nova_compute[226294]: 2026-02-02 10:16:57.216 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:57 compute-1 nova_compute[226294]: 2026-02-02 10:16:57.264 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:16:57 compute-1 nova_compute[226294]: 2026-02-02 10:16:57.265 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:16:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:16:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:57.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:16:58.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:16:59 compute-1 ceph-mon[80115]: pgmap v1129: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:16:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:16:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:16:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:16:59.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:00.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:01 compute-1 ceph-mon[80115]: pgmap v1130: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:01.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:01 compute-1 sudo[245364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:17:01 compute-1 sudo[245364]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:17:01 compute-1 sudo[245364]: pam_unix(sudo:session): session closed for user root
Feb 02 10:17:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:02.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:02 compute-1 nova_compute[226294]: 2026-02-02 10:17:02.267 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:02 compute-1 nova_compute[226294]: 2026-02-02 10:17:02.269 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:02 compute-1 nova_compute[226294]: 2026-02-02 10:17:02.269 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:02 compute-1 nova_compute[226294]: 2026-02-02 10:17:02.269 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:02 compute-1 nova_compute[226294]: 2026-02-02 10:17:02.304 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:02 compute-1 nova_compute[226294]: 2026-02-02 10:17:02.304 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:03 compute-1 ceph-mon[80115]: pgmap v1131: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:03 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:17:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:03.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:04.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:05 compute-1 ceph-mon[80115]: pgmap v1132: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:05.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:06.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:07 compute-1 ceph-mon[80115]: pgmap v1133: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:07 compute-1 nova_compute[226294]: 2026-02-02 10:17:07.305 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:07 compute-1 nova_compute[226294]: 2026-02-02 10:17:07.307 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:07 compute-1 nova_compute[226294]: 2026-02-02 10:17:07.307 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:07 compute-1 nova_compute[226294]: 2026-02-02 10:17:07.307 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:07 compute-1 nova_compute[226294]: 2026-02-02 10:17:07.335 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:07 compute-1 nova_compute[226294]: 2026-02-02 10:17:07.335 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:07.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:08.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:09 compute-1 ceph-mon[80115]: pgmap v1134: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:09.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:10.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:11 compute-1 ceph-mon[80115]: pgmap v1135: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:11.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:12.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:12 compute-1 nova_compute[226294]: 2026-02-02 10:17:12.336 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:12 compute-1 nova_compute[226294]: 2026-02-02 10:17:12.338 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:12 compute-1 nova_compute[226294]: 2026-02-02 10:17:12.338 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:12 compute-1 nova_compute[226294]: 2026-02-02 10:17:12.338 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:12 compute-1 nova_compute[226294]: 2026-02-02 10:17:12.372 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:12 compute-1 nova_compute[226294]: 2026-02-02 10:17:12.372 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:13 compute-1 ceph-mon[80115]: pgmap v1136: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:17:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:13.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:17:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:14.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:14 compute-1 podman[245395]: 2026-02-02 10:17:14.444194848 +0000 UTC m=+0.118240397 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 02 10:17:15 compute-1 ceph-mon[80115]: pgmap v1137: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2830967543' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:17:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3368655149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:17:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:15.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:16.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:16 compute-1 nova_compute[226294]: 2026-02-02 10:17:16.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:17:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/530742723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:17:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1487047632' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:17:17 compute-1 nova_compute[226294]: 2026-02-02 10:17:17.373 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:17 compute-1 nova_compute[226294]: 2026-02-02 10:17:17.379 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:17 compute-1 nova_compute[226294]: 2026-02-02 10:17:17.379 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:17 compute-1 nova_compute[226294]: 2026-02-02 10:17:17.379 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:17 compute-1 nova_compute[226294]: 2026-02-02 10:17:17.429 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:17 compute-1 nova_compute[226294]: 2026-02-02 10:17:17.429 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:17 compute-1 nova_compute[226294]: 2026-02-02 10:17:17.430 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:17 compute-1 ceph-mon[80115]: pgmap v1138: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:17:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:17.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:18.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:18 compute-1 nova_compute[226294]: 2026-02-02 10:17:18.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:17:18 compute-1 nova_compute[226294]: 2026-02-02 10:17:18.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:17:18 compute-1 nova_compute[226294]: 2026-02-02 10:17:18.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:17:18 compute-1 nova_compute[226294]: 2026-02-02 10:17:18.666 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:17:18 compute-1 nova_compute[226294]: 2026-02-02 10:17:18.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:17:18 compute-1 nova_compute[226294]: 2026-02-02 10:17:18.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:17:19 compute-1 nova_compute[226294]: 2026-02-02 10:17:19.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:17:19 compute-1 ceph-mon[80115]: pgmap v1139: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:17:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:19.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:17:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:20.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:21 compute-1 nova_compute[226294]: 2026-02-02 10:17:21.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:17:21 compute-1 ceph-mon[80115]: pgmap v1140: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:21.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:21 compute-1 sudo[245427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:17:21 compute-1 sudo[245427]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:17:21 compute-1 sudo[245427]: pam_unix(sudo:session): session closed for user root
Feb 02 10:17:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:22.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:22 compute-1 nova_compute[226294]: 2026-02-02 10:17:22.431 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:22 compute-1 nova_compute[226294]: 2026-02-02 10:17:22.433 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:22 compute-1 nova_compute[226294]: 2026-02-02 10:17:22.433 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:22 compute-1 nova_compute[226294]: 2026-02-02 10:17:22.433 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:22 compute-1 nova_compute[226294]: 2026-02-02 10:17:22.475 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:22 compute-1 nova_compute[226294]: 2026-02-02 10:17:22.475 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:23 compute-1 podman[245453]: 2026-02-02 10:17:23.36066748 +0000 UTC m=+0.041812267 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 02 10:17:23 compute-1 nova_compute[226294]: 2026-02-02 10:17:23.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:17:23 compute-1 nova_compute[226294]: 2026-02-02 10:17:23.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:17:23 compute-1 nova_compute[226294]: 2026-02-02 10:17:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:17:23 compute-1 nova_compute[226294]: 2026-02-02 10:17:23.679 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:17:23 compute-1 nova_compute[226294]: 2026-02-02 10:17:23.679 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:17:23 compute-1 nova_compute[226294]: 2026-02-02 10:17:23.680 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:17:23 compute-1 nova_compute[226294]: 2026-02-02 10:17:23.680 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:17:23 compute-1 nova_compute[226294]: 2026-02-02 10:17:23.681 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:17:23 compute-1 ceph-mon[80115]: pgmap v1141: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:23.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:24 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:17:24 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/557594777' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.130 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:17:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:24.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.272 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.274 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4874MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.274 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.275 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.334 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.335 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.355 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:17:24 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/557594777' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:17:24 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:17:24 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3229599435' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.804 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.810 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.830 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.833 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:17:24 compute-1 nova_compute[226294]: 2026-02-02 10:17:24.834 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:17:25 compute-1 ceph-mon[80115]: pgmap v1142: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3229599435' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:17:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:25.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:26.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:26 compute-1 nova_compute[226294]: 2026-02-02 10:17:26.836 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:17:27 compute-1 nova_compute[226294]: 2026-02-02 10:17:27.476 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:27 compute-1 nova_compute[226294]: 2026-02-02 10:17:27.478 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:27 compute-1 nova_compute[226294]: 2026-02-02 10:17:27.479 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:27 compute-1 nova_compute[226294]: 2026-02-02 10:17:27.479 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:27 compute-1 nova_compute[226294]: 2026-02-02 10:17:27.515 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:27 compute-1 nova_compute[226294]: 2026-02-02 10:17:27.515 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:27 compute-1 ceph-mon[80115]: pgmap v1143: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:27.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:17:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:28.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:17:29 compute-1 ceph-mon[80115]: pgmap v1144: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:29.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:30.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:31 compute-1 sudo[245520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:17:31 compute-1 sudo[245520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:17:31 compute-1 sudo[245520]: pam_unix(sudo:session): session closed for user root
Feb 02 10:17:31 compute-1 sudo[245545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:17:31 compute-1 sudo[245545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:17:31 compute-1 sudo[245545]: pam_unix(sudo:session): session closed for user root
Feb 02 10:17:31 compute-1 ceph-mon[80115]: pgmap v1145: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:17:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:31.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:17:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:32.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:32 compute-1 nova_compute[226294]: 2026-02-02 10:17:32.516 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:32 compute-1 nova_compute[226294]: 2026-02-02 10:17:32.518 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:32 compute-1 nova_compute[226294]: 2026-02-02 10:17:32.518 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:32 compute-1 nova_compute[226294]: 2026-02-02 10:17:32.518 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:32 compute-1 nova_compute[226294]: 2026-02-02 10:17:32.560 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:32 compute-1 nova_compute[226294]: 2026-02-02 10:17:32.561 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:17:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:17:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:17:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:17:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:17:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:17:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:17:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:17:33 compute-1 ceph-mon[80115]: pgmap v1146: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 512 B/s rd, 0 op/s
Feb 02 10:17:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:33.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:34.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:34 compute-1 ceph-mon[80115]: pgmap v1147: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 769 B/s rd, 0 op/s
Feb 02 10:17:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:35.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:36.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:36 compute-1 sudo[245604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:17:36 compute-1 sudo[245604]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:17:36 compute-1 sudo[245604]: pam_unix(sudo:session): session closed for user root
Feb 02 10:17:37 compute-1 ceph-mon[80115]: pgmap v1148: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 512 B/s rd, 0 op/s
Feb 02 10:17:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:17:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:17:37 compute-1 nova_compute[226294]: 2026-02-02 10:17:37.563 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:37 compute-1 nova_compute[226294]: 2026-02-02 10:17:37.564 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:37 compute-1 nova_compute[226294]: 2026-02-02 10:17:37.564 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:37 compute-1 nova_compute[226294]: 2026-02-02 10:17:37.564 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:37 compute-1 nova_compute[226294]: 2026-02-02 10:17:37.614 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:37 compute-1 nova_compute[226294]: 2026-02-02 10:17:37.615 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.646342) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457646388, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1565, "num_deletes": 505, "total_data_size": 3212587, "memory_usage": 3274512, "flush_reason": "Manual Compaction"}
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457662079, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2072083, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33979, "largest_seqno": 35539, "table_properties": {"data_size": 2065840, "index_size": 2998, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 15397, "raw_average_key_size": 17, "raw_value_size": 2051204, "raw_average_value_size": 2357, "num_data_blocks": 132, "num_entries": 870, "num_filter_entries": 870, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027353, "oldest_key_time": 1770027353, "file_creation_time": 1770027457, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 15844 microseconds, and 6053 cpu microseconds.
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.662141) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2072083 bytes OK
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.662209) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.664047) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.664073) EVENT_LOG_v1 {"time_micros": 1770027457664065, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.664099) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 3204307, prev total WAL file size 3204307, number of live WAL files 2.
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.665214) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323534' seq:72057594037927935, type:22 .. '6B7600353035' seq:0, type:0; will stop at (end)
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(2023KB)], [63(13MB)]
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457665292, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 15978032, "oldest_snapshot_seqno": -1}
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6456 keys, 14497144 bytes, temperature: kUnknown
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457820205, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 14497144, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14453674, "index_size": 26208, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16197, "raw_key_size": 169525, "raw_average_key_size": 26, "raw_value_size": 14337089, "raw_average_value_size": 2220, "num_data_blocks": 1037, "num_entries": 6456, "num_filter_entries": 6456, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027457, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.820535) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 14497144 bytes
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.822262) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 103.1 rd, 93.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 13.3 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(14.7) write-amplify(7.0) OK, records in: 7481, records dropped: 1025 output_compression: NoCompression
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.822298) EVENT_LOG_v1 {"time_micros": 1770027457822282, "job": 38, "event": "compaction_finished", "compaction_time_micros": 155000, "compaction_time_cpu_micros": 40127, "output_level": 6, "num_output_files": 1, "total_output_size": 14497144, "num_input_records": 7481, "num_output_records": 6456, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457822775, "job": 38, "event": "table_file_deletion", "file_number": 65}
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027457825079, "job": 38, "event": "table_file_deletion", "file_number": 63}
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.665046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825150) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825186) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:17:37 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:17:37.825191) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:17:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:17:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:37.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:17:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:38.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:39 compute-1 ceph-mon[80115]: pgmap v1149: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 512 B/s rd, 0 op/s
Feb 02 10:17:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:39.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:40.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:41 compute-1 ceph-mon[80115]: pgmap v1150: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 769 B/s rd, 0 op/s
Feb 02 10:17:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:41.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:42 compute-1 sudo[245632]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:17:42 compute-1 sudo[245632]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:17:42 compute-1 sudo[245632]: pam_unix(sudo:session): session closed for user root
Feb 02 10:17:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:42.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:42 compute-1 nova_compute[226294]: 2026-02-02 10:17:42.616 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:42 compute-1 nova_compute[226294]: 2026-02-02 10:17:42.618 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:42 compute-1 nova_compute[226294]: 2026-02-02 10:17:42.618 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:42 compute-1 nova_compute[226294]: 2026-02-02 10:17:42.618 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:42 compute-1 nova_compute[226294]: 2026-02-02 10:17:42.649 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:42 compute-1 nova_compute[226294]: 2026-02-02 10:17:42.650 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:43 compute-1 ceph-mon[80115]: pgmap v1151: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 512 B/s rd, 0 op/s
Feb 02 10:17:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:43.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:44.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:17:44.918 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:17:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:17:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:17:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:17:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:17:45 compute-1 podman[245659]: 2026-02-02 10:17:45.460818215 +0000 UTC m=+0.131302446 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 02 10:17:45 compute-1 ceph-mon[80115]: pgmap v1152: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:45.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:46.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:47 compute-1 nova_compute[226294]: 2026-02-02 10:17:47.651 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:47 compute-1 ceph-mon[80115]: pgmap v1153: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:17:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:47.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:48.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:49 compute-1 ceph-mon[80115]: pgmap v1154: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:17:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:49.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:17:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:50.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:51 compute-1 ceph-mon[80115]: pgmap v1155: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:51.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:52.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:52 compute-1 nova_compute[226294]: 2026-02-02 10:17:52.653 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:52 compute-1 nova_compute[226294]: 2026-02-02 10:17:52.654 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:17:52 compute-1 nova_compute[226294]: 2026-02-02 10:17:52.654 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:17:52 compute-1 nova_compute[226294]: 2026-02-02 10:17:52.655 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:52 compute-1 nova_compute[226294]: 2026-02-02 10:17:52.694 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:52 compute-1 nova_compute[226294]: 2026-02-02 10:17:52.695 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:17:53 compute-1 ceph-mon[80115]: pgmap v1156: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:17:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:53.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:17:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:54.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:54 compute-1 podman[245691]: 2026-02-02 10:17:54.387037939 +0000 UTC m=+0.060651500 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:17:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 02 10:17:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2534645283' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:17:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 02 10:17:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2534645283' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:17:55 compute-1 ceph-mon[80115]: pgmap v1157: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:17:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2534645283' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:17:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2534645283' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:17:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:17:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:55.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:17:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:56.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:17:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:17:57 compute-1 nova_compute[226294]: 2026-02-02 10:17:57.695 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:17:57 compute-1 ceph-mon[80115]: pgmap v1158: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:17:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:57.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:17:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:17:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:17:58.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:17:59 compute-1 ceph-mon[80115]: pgmap v1159: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:17:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:17:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:17:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:17:59.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000054s ======
Feb 02 10:18:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:00.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Feb 02 10:18:01 compute-1 ceph-mon[80115]: pgmap v1160: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:18:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:01.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:18:02 compute-1 sudo[245715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:18:02 compute-1 sudo[245715]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:18:02 compute-1 sudo[245715]: pam_unix(sudo:session): session closed for user root
Feb 02 10:18:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:02.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:02 compute-1 nova_compute[226294]: 2026-02-02 10:18:02.697 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:02 compute-1 nova_compute[226294]: 2026-02-02 10:18:02.698 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:02 compute-1 nova_compute[226294]: 2026-02-02 10:18:02.698 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:18:02 compute-1 nova_compute[226294]: 2026-02-02 10:18:02.698 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:02 compute-1 nova_compute[226294]: 2026-02-02 10:18:02.729 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:02 compute-1 nova_compute[226294]: 2026-02-02 10:18:02.730 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:02 compute-1 ceph-mon[80115]: pgmap v1161: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:18:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:03.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:04.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:05 compute-1 ceph-mon[80115]: pgmap v1162: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:05.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:06.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:07 compute-1 ceph-mon[80115]: pgmap v1163: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:07 compute-1 nova_compute[226294]: 2026-02-02 10:18:07.731 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:07.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:08.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:09 compute-1 ceph-mon[80115]: pgmap v1164: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:09.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:10.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:11 compute-1 ceph-mon[80115]: pgmap v1165: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:11.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:12.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:12 compute-1 nova_compute[226294]: 2026-02-02 10:18:12.732 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:12 compute-1 nova_compute[226294]: 2026-02-02 10:18:12.734 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:12 compute-1 nova_compute[226294]: 2026-02-02 10:18:12.735 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:18:12 compute-1 nova_compute[226294]: 2026-02-02 10:18:12.735 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:12 compute-1 nova_compute[226294]: 2026-02-02 10:18:12.762 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:12 compute-1 nova_compute[226294]: 2026-02-02 10:18:12.763 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:13 compute-1 ceph-mon[80115]: pgmap v1166: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:13.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:15.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:15 compute-1 ceph-mon[80115]: pgmap v1167: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:15.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1079793048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:18:16 compute-1 podman[245747]: 2026-02-02 10:18:16.476032142 +0000 UTC m=+0.135679743 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Feb 02 10:18:16 compute-1 nova_compute[226294]: 2026-02-02 10:18:16.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:16 compute-1 nova_compute[226294]: 2026-02-02 10:18:16.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 02 10:18:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:17.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:17 compute-1 ceph-mon[80115]: pgmap v1168: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:17 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3102003360' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:18:17 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3816661589' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:18:17 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2892987634' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:18:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:17 compute-1 nova_compute[226294]: 2026-02-02 10:18:17.764 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:17 compute-1 nova_compute[226294]: 2026-02-02 10:18:17.765 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:17 compute-1 nova_compute[226294]: 2026-02-02 10:18:17.766 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:18:17 compute-1 nova_compute[226294]: 2026-02-02 10:18:17.766 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:17 compute-1 nova_compute[226294]: 2026-02-02 10:18:17.766 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:17 compute-1 nova_compute[226294]: 2026-02-02 10:18:17.769 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:17.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:18 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:18:18 compute-1 nova_compute[226294]: 2026-02-02 10:18:18.666 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:18 compute-1 nova_compute[226294]: 2026-02-02 10:18:18.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:19.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:19 compute-1 ceph-mon[80115]: pgmap v1169: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:19 compute-1 nova_compute[226294]: 2026-02-02 10:18:19.645 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:19 compute-1 nova_compute[226294]: 2026-02-02 10:18:19.672 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:19.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:20 compute-1 nova_compute[226294]: 2026-02-02 10:18:20.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:20 compute-1 nova_compute[226294]: 2026-02-02 10:18:20.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:18:20 compute-1 nova_compute[226294]: 2026-02-02 10:18:20.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:18:20 compute-1 nova_compute[226294]: 2026-02-02 10:18:20.663 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:18:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:21.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:21 compute-1 ceph-mon[80115]: pgmap v1170: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:21 compute-1 nova_compute[226294]: 2026-02-02 10:18:21.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:21.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:22 compute-1 sudo[245777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:18:22 compute-1 sudo[245777]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:18:22 compute-1 sudo[245777]: pam_unix(sudo:session): session closed for user root
Feb 02 10:18:22 compute-1 nova_compute[226294]: 2026-02-02 10:18:22.645 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:22 compute-1 nova_compute[226294]: 2026-02-02 10:18:22.765 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:22 compute-1 nova_compute[226294]: 2026-02-02 10:18:22.770 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:23.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:23 compute-1 ceph-mon[80115]: pgmap v1171: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:23 compute-1 nova_compute[226294]: 2026-02-02 10:18:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:23 compute-1 nova_compute[226294]: 2026-02-02 10:18:23.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:18:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:24.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:24 compute-1 nova_compute[226294]: 2026-02-02 10:18:24.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:24 compute-1 nova_compute[226294]: 2026-02-02 10:18:24.689 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:18:24 compute-1 nova_compute[226294]: 2026-02-02 10:18:24.689 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:18:24 compute-1 nova_compute[226294]: 2026-02-02 10:18:24.690 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:18:24 compute-1 nova_compute[226294]: 2026-02-02 10:18:24.690 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:18:24 compute-1 nova_compute[226294]: 2026-02-02 10:18:24.690 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:18:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:18:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:25.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:18:25 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:18:25 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/356890111' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:18:25 compute-1 nova_compute[226294]: 2026-02-02 10:18:25.168 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:18:25 compute-1 ceph-mon[80115]: pgmap v1172: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:25 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/356890111' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:18:25 compute-1 nova_compute[226294]: 2026-02-02 10:18:25.352 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:18:25 compute-1 nova_compute[226294]: 2026-02-02 10:18:25.354 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4901MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:18:25 compute-1 nova_compute[226294]: 2026-02-02 10:18:25.354 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:18:25 compute-1 nova_compute[226294]: 2026-02-02 10:18:25.355 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:18:25 compute-1 podman[245826]: 2026-02-02 10:18:25.37912583 +0000 UTC m=+0.055079129 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:18:25 compute-1 nova_compute[226294]: 2026-02-02 10:18:25.574 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:18:25 compute-1 nova_compute[226294]: 2026-02-02 10:18:25.575 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:18:25 compute-1 nova_compute[226294]: 2026-02-02 10:18:25.606 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:18:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:26.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:18:26 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1611172185' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:18:26 compute-1 nova_compute[226294]: 2026-02-02 10:18:26.061 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:18:26 compute-1 nova_compute[226294]: 2026-02-02 10:18:26.066 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:18:26 compute-1 nova_compute[226294]: 2026-02-02 10:18:26.087 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:18:26 compute-1 nova_compute[226294]: 2026-02-02 10:18:26.090 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:18:26 compute-1 nova_compute[226294]: 2026-02-02 10:18:26.090 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:18:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1611172185' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:18:26 compute-1 nova_compute[226294]: 2026-02-02 10:18:26.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:26 compute-1 nova_compute[226294]: 2026-02-02 10:18:26.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:27.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:27 compute-1 ceph-mon[80115]: pgmap v1173: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:27 compute-1 nova_compute[226294]: 2026-02-02 10:18:27.767 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:27 compute-1 nova_compute[226294]: 2026-02-02 10:18:27.771 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:28.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:29.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:29 compute-1 ceph-mon[80115]: pgmap v1174: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:18:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:30.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:18:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:31.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:31 compute-1 ceph-mon[80115]: pgmap v1175: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:32.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:18:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:32 compute-1 nova_compute[226294]: 2026-02-02 10:18:32.772 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:32 compute-1 nova_compute[226294]: 2026-02-02 10:18:32.774 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:32 compute-1 nova_compute[226294]: 2026-02-02 10:18:32.774 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:18:32 compute-1 nova_compute[226294]: 2026-02-02 10:18:32.774 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:32 compute-1 nova_compute[226294]: 2026-02-02 10:18:32.802 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:32 compute-1 nova_compute[226294]: 2026-02-02 10:18:32.802 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:33.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:33 compute-1 ceph-mon[80115]: pgmap v1176: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:34.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:35.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:35 compute-1 ceph-mon[80115]: pgmap v1177: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:36.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:36 compute-1 sudo[245872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:18:36 compute-1 sudo[245872]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:18:36 compute-1 sudo[245872]: pam_unix(sudo:session): session closed for user root
Feb 02 10:18:36 compute-1 sudo[245897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:18:36 compute-1 sudo[245897]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:18:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:37.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:37 compute-1 sudo[245897]: pam_unix(sudo:session): session closed for user root
Feb 02 10:18:37 compute-1 ceph-mon[80115]: pgmap v1178: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:18:37 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:18:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:37 compute-1 nova_compute[226294]: 2026-02-02 10:18:37.803 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:38.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:18:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:18:38 compute-1 ceph-mon[80115]: pgmap v1179: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 521 B/s rd, 0 op/s
Feb 02 10:18:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:18:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:18:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:18:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:18:38 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:18:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:39.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:39 compute-1 nova_compute[226294]: 2026-02-02 10:18:39.667 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:18:39 compute-1 nova_compute[226294]: 2026-02-02 10:18:39.668 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 02 10:18:39 compute-1 nova_compute[226294]: 2026-02-02 10:18:39.704 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 02 10:18:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:40.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:40 compute-1 ceph-mon[80115]: pgmap v1180: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 781 B/s rd, 0 op/s
Feb 02 10:18:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:18:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:41.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:18:41 compute-1 sshd-session[245955]: Invalid user solv from 80.94.92.184 port 34228
Feb 02 10:18:41 compute-1 sshd-session[245955]: Connection closed by invalid user solv 80.94.92.184 port 34228 [preauth]
Feb 02 10:18:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:42.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:42 compute-1 sudo[245957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:18:42 compute-1 sudo[245957]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:18:42 compute-1 sudo[245957]: pam_unix(sudo:session): session closed for user root
Feb 02 10:18:42 compute-1 sudo[245982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:18:42 compute-1 sudo[245982]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:18:42 compute-1 sudo[245982]: pam_unix(sudo:session): session closed for user root
Feb 02 10:18:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:42 compute-1 nova_compute[226294]: 2026-02-02 10:18:42.805 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:42 compute-1 nova_compute[226294]: 2026-02-02 10:18:42.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:42 compute-1 nova_compute[226294]: 2026-02-02 10:18:42.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:18:42 compute-1 nova_compute[226294]: 2026-02-02 10:18:42.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:42 compute-1 nova_compute[226294]: 2026-02-02 10:18:42.807 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:42 compute-1 nova_compute[226294]: 2026-02-02 10:18:42.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:42 compute-1 ceph-mon[80115]: pgmap v1181: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 521 B/s rd, 0 op/s
Feb 02 10:18:42 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:18:42 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:18:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:43.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:44.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:18:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:18:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:18:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:18:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:18:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:18:44 compute-1 ceph-mon[80115]: pgmap v1182: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 781 B/s rd, 0 op/s
Feb 02 10:18:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:45.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:45 compute-1 ceph-mon[80115]: pgmap v1183: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 521 B/s rd, 0 op/s
Feb 02 10:18:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:18:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:46.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:18:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:18:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:47.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:18:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:18:47 compute-1 podman[246010]: 2026-02-02 10:18:47.434586136 +0000 UTC m=+0.107762053 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 02 10:18:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:47 compute-1 nova_compute[226294]: 2026-02-02 10:18:47.808 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:47 compute-1 nova_compute[226294]: 2026-02-02 10:18:47.810 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:18:47 compute-1 nova_compute[226294]: 2026-02-02 10:18:47.810 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:18:47 compute-1 nova_compute[226294]: 2026-02-02 10:18:47.811 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:47 compute-1 nova_compute[226294]: 2026-02-02 10:18:47.840 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:47 compute-1 nova_compute[226294]: 2026-02-02 10:18:47.840 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:18:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:18:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:48.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:18:48 compute-1 ceph-mon[80115]: pgmap v1184: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 521 B/s rd, 0 op/s
Feb 02 10:18:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:49.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:50.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:50 compute-1 ceph-mon[80115]: pgmap v1185: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:51.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:52.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:52 compute-1 nova_compute[226294]: 2026-02-02 10:18:52.841 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:52 compute-1 ceph-mon[80115]: pgmap v1186: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:53.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:54.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:54 compute-1 ceph-mon[80115]: pgmap v1187: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:18:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 02 10:18:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2702642017' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:18:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 02 10:18:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2702642017' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:18:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:55.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2702642017' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:18:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2702642017' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:18:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:56.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:56 compute-1 podman[246041]: 2026-02-02 10:18:56.437530136 +0000 UTC m=+0.099669740 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:18:56 compute-1 ceph-mon[80115]: pgmap v1188: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:18:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:57.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:18:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:18:57 compute-1 nova_compute[226294]: 2026-02-02 10:18:57.842 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:18:57 compute-1 ceph-mon[80115]: pgmap v1189: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:18:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:18:58.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:18:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:18:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:18:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:18:59.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:00.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:19:00 compute-1 nova_compute[226294]: 2026-02-02 10:19:00.442 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:00 compute-1 ceph-mon[80115]: pgmap v1190: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:01.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:02.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:02 compute-1 sudo[246062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:19:02 compute-1 sudo[246062]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:19:02 compute-1 sudo[246062]: pam_unix(sudo:session): session closed for user root
Feb 02 10:19:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:02 compute-1 nova_compute[226294]: 2026-02-02 10:19:02.843 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:02 compute-1 nova_compute[226294]: 2026-02-02 10:19:02.846 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:02 compute-1 nova_compute[226294]: 2026-02-02 10:19:02.846 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:19:02 compute-1 nova_compute[226294]: 2026-02-02 10:19:02.846 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:02 compute-1 nova_compute[226294]: 2026-02-02 10:19:02.847 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:02 compute-1 nova_compute[226294]: 2026-02-02 10:19:02.848 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:02 compute-1 ceph-mon[80115]: pgmap v1191: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:19:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:03.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:19:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:04.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:04 compute-1 ceph-mon[80115]: pgmap v1192: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:05.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:06.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:19:06 compute-1 ceph-mon[80115]: pgmap v1193: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:07.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:07 compute-1 nova_compute[226294]: 2026-02-02 10:19:07.847 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:07 compute-1 ceph-mon[80115]: pgmap v1194: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:08.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:19:08 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 12K writes, 47K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 12K writes, 3443 syncs, 3.64 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1534 writes, 5165 keys, 1534 commit groups, 1.0 writes per commit group, ingest: 4.83 MB, 0.01 MB/s
                                           Interval WAL: 1534 writes, 631 syncs, 2.43 writes per sync, written: 0.00 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 02 10:19:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:09.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:19:09 compute-1 ceph-mon[80115]: pgmap v1195: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:10.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:11.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:12.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:12 compute-1 nova_compute[226294]: 2026-02-02 10:19:12.848 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:12 compute-1 nova_compute[226294]: 2026-02-02 10:19:12.851 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:12 compute-1 ceph-mon[80115]: pgmap v1196: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:13.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:14.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:14 compute-1 ceph-mon[80115]: pgmap v1197: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:14 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/367606097' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:19:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:15.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:16.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:16 compute-1 ceph-mon[80115]: pgmap v1198: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2247780628' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:19:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:17.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:17 compute-1 nova_compute[226294]: 2026-02-02 10:19:17.852 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:17 compute-1 nova_compute[226294]: 2026-02-02 10:19:17.853 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:17 compute-1 nova_compute[226294]: 2026-02-02 10:19:17.854 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:19:17 compute-1 nova_compute[226294]: 2026-02-02 10:19:17.854 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:17 compute-1 nova_compute[226294]: 2026-02-02 10:19:17.903 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:17 compute-1 nova_compute[226294]: 2026-02-02 10:19:17.903 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:19:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:18.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:18 compute-1 podman[246095]: 2026-02-02 10:19:18.414468434 +0000 UTC m=+0.089407288 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 02 10:19:18 compute-1 nova_compute[226294]: 2026-02-02 10:19:18.676 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:18 compute-1 nova_compute[226294]: 2026-02-02 10:19:18.677 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:18 compute-1 ceph-mon[80115]: pgmap v1199: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:18 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1662614057' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:19:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:19.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:19 compute-1 nova_compute[226294]: 2026-02-02 10:19:19.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:19 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3266834973' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:19:19 compute-1 ceph-mon[80115]: pgmap v1200: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:20.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:20 compute-1 nova_compute[226294]: 2026-02-02 10:19:20.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:20 compute-1 nova_compute[226294]: 2026-02-02 10:19:20.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:19:20 compute-1 nova_compute[226294]: 2026-02-02 10:19:20.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:19:20 compute-1 nova_compute[226294]: 2026-02-02 10:19:20.706 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:19:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:21.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:22.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:22 compute-1 sudo[246123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:19:22 compute-1 sudo[246123]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:19:22 compute-1 sudo[246123]: pam_unix(sudo:session): session closed for user root
Feb 02 10:19:22 compute-1 nova_compute[226294]: 2026-02-02 10:19:22.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:22 compute-1 ceph-mon[80115]: pgmap v1201: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:22 compute-1 nova_compute[226294]: 2026-02-02 10:19:22.904 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:23.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:23 compute-1 nova_compute[226294]: 2026-02-02 10:19:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:23 compute-1 nova_compute[226294]: 2026-02-02 10:19:23.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:19:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:24.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:24 compute-1 nova_compute[226294]: 2026-02-02 10:19:24.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:24 compute-1 ceph-mon[80115]: pgmap v1202: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:25.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:19:25 compute-1 nova_compute[226294]: 2026-02-02 10:19:25.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:25 compute-1 nova_compute[226294]: 2026-02-02 10:19:25.670 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:19:25 compute-1 nova_compute[226294]: 2026-02-02 10:19:25.671 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:19:25 compute-1 nova_compute[226294]: 2026-02-02 10:19:25.671 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:19:25 compute-1 nova_compute[226294]: 2026-02-02 10:19:25.671 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:19:25 compute-1 nova_compute[226294]: 2026-02-02 10:19:25.671 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:19:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:26.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:26 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:19:26 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2773930151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.145 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.284 226298 DEBUG oslo_concurrency.processutils [None req-c44df769-14b9-4ff4-8b94-fd29c4457052 41d09654a7d04d60a23411cf80fe1f98 823d3e7e313a44e9a50531e3fef22a1b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.304 226298 DEBUG oslo_concurrency.processutils [None req-c44df769-14b9-4ff4-8b94-fd29c4457052 41d09654a7d04d60a23411cf80fe1f98 823d3e7e313a44e9a50531e3fef22a1b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.353 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.354 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4902MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.354 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.354 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.470 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.471 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.487 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.587 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.588 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.607 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.636 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 02 10:19:26 compute-1 nova_compute[226294]: 2026-02-02 10:19:26.654 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:19:26 compute-1 ceph-mon[80115]: pgmap v1203: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:26 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2773930151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.943076) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027566943207, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1280, "num_deletes": 251, "total_data_size": 3159674, "memory_usage": 3210328, "flush_reason": "Manual Compaction"}
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027566964058, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 2057326, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35544, "largest_seqno": 36819, "table_properties": {"data_size": 2051765, "index_size": 2956, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11972, "raw_average_key_size": 19, "raw_value_size": 2040593, "raw_average_value_size": 3395, "num_data_blocks": 130, "num_entries": 601, "num_filter_entries": 601, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027458, "oldest_key_time": 1770027458, "file_creation_time": 1770027566, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 21075 microseconds, and 3790 cpu microseconds.
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.964129) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 2057326 bytes OK
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.964188) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967189) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967207) EVENT_LOG_v1 {"time_micros": 1770027566967202, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967228) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 3153571, prev total WAL file size 3153571, number of live WAL files 2.
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967816) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(2009KB)], [66(13MB)]
Feb 02 10:19:26 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027566967865, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 16554470, "oldest_snapshot_seqno": -1}
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6541 keys, 14507996 bytes, temperature: kUnknown
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027567092125, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 14507996, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14463888, "index_size": 26652, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 171985, "raw_average_key_size": 26, "raw_value_size": 14345423, "raw_average_value_size": 2193, "num_data_blocks": 1052, "num_entries": 6541, "num_filter_entries": 6541, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027566, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.092729) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 14507996 bytes
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.094261) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.8 rd, 116.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 13.8 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(15.1) write-amplify(7.1) OK, records in: 7057, records dropped: 516 output_compression: NoCompression
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.094291) EVENT_LOG_v1 {"time_micros": 1770027567094277, "job": 40, "event": "compaction_finished", "compaction_time_micros": 124658, "compaction_time_cpu_micros": 26506, "output_level": 6, "num_output_files": 1, "total_output_size": 14507996, "num_input_records": 7057, "num_output_records": 6541, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027567095221, "job": 40, "event": "table_file_deletion", "file_number": 68}
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027567097337, "job": 40, "event": "table_file_deletion", "file_number": 66}
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:26.967759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097491) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:19:27 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:19:27.097494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:19:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:19:27 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3504227893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:19:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:27.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:27 compute-1 nova_compute[226294]: 2026-02-02 10:19:27.127 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:19:27 compute-1 nova_compute[226294]: 2026-02-02 10:19:27.133 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:19:27 compute-1 nova_compute[226294]: 2026-02-02 10:19:27.162 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:19:27 compute-1 nova_compute[226294]: 2026-02-02 10:19:27.163 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:19:27 compute-1 nova_compute[226294]: 2026-02-02 10:19:27.163 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:19:27 compute-1 podman[246196]: 2026-02-02 10:19:27.382475472 +0000 UTC m=+0.055208952 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:19:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:27 compute-1 nova_compute[226294]: 2026-02-02 10:19:27.906 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:27 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3504227893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:19:27 compute-1 ceph-mon[80115]: pgmap v1204: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:28.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:28 compute-1 nova_compute[226294]: 2026-02-02 10:19:28.164 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:19:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:29.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:30.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:30 compute-1 ceph-mon[80115]: pgmap v1205: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:31.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:31 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:19:31.890 143542 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '66:4f:4d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4a:a7:f3:61:65:15'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 02 10:19:31 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:19:31.891 143542 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 02 10:19:31 compute-1 nova_compute[226294]: 2026-02-02 10:19:31.891 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:32.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:32 compute-1 ceph-mon[80115]: pgmap v1206: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:19:32 compute-1 nova_compute[226294]: 2026-02-02 10:19:32.955 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:33.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:34.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:34 compute-1 ceph-mon[80115]: pgmap v1207: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:35.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:19:35 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 6906 writes, 36K keys, 6906 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s
                                           Cumulative WAL: 6906 writes, 6906 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1550 writes, 8350 keys, 1550 commit groups, 1.0 writes per commit group, ingest: 17.96 MB, 0.03 MB/s
                                           Interval WAL: 1550 writes, 1550 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    105.3      0.52              0.13        20    0.026       0      0       0.0       0.0
                                             L6      1/0   13.84 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.4    119.0    101.8      2.39              0.54        19    0.126    108K    10K       0.0       0.0
                                            Sum      1/0   13.84 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.4     97.6    102.4      2.91              0.67        39    0.075    108K    10K       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.0     99.7    100.4      0.78              0.17        10    0.078     34K   3576       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0    119.0    101.8      2.39              0.54        19    0.126    108K    10K       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    105.7      0.52              0.13        19    0.027       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.054, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.29 GB write, 0.12 MB/s write, 0.28 GB read, 0.12 MB/s read, 2.9 seconds
                                           Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.8 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a64debd350#2 capacity: 304.00 MB usage: 26.29 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000284 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1582,25.45 MB,8.37284%) FilterBlock(39,320.67 KB,0.103012%) IndexBlock(39,533.52 KB,0.171385%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Feb 02 10:19:35 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:19:35.893 143542 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2f54a3b0-231a-4b96-9e3a-0a36e3e73216, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 02 10:19:35 compute-1 ceph-mon[80115]: pgmap v1208: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:36.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:19:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:37.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:37 compute-1 nova_compute[226294]: 2026-02-02 10:19:37.957 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:37 compute-1 nova_compute[226294]: 2026-02-02 10:19:37.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:37 compute-1 nova_compute[226294]: 2026-02-02 10:19:37.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:19:37 compute-1 nova_compute[226294]: 2026-02-02 10:19:37.959 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:37 compute-1 nova_compute[226294]: 2026-02-02 10:19:37.995 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:37 compute-1 nova_compute[226294]: 2026-02-02 10:19:37.995 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:38.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:38 compute-1 ceph-mon[80115]: pgmap v1209: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:39.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:40.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:40 compute-1 ceph-mon[80115]: pgmap v1210: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:41.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:41 compute-1 ceph-mon[80115]: pgmap v1211: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:42.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:42 compute-1 sudo[246224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:19:42 compute-1 sudo[246224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:19:42 compute-1 sudo[246224]: pam_unix(sudo:session): session closed for user root
Feb 02 10:19:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:42 compute-1 sudo[246249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:19:42 compute-1 sudo[246249]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:19:42 compute-1 sudo[246249]: pam_unix(sudo:session): session closed for user root
Feb 02 10:19:42 compute-1 sudo[246274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Feb 02 10:19:42 compute-1 sudo[246274]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:19:42 compute-1 nova_compute[226294]: 2026-02-02 10:19:42.996 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:42 compute-1 nova_compute[226294]: 2026-02-02 10:19:42.998 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:42 compute-1 nova_compute[226294]: 2026-02-02 10:19:42.999 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:19:42 compute-1 nova_compute[226294]: 2026-02-02 10:19:42.999 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:43 compute-1 nova_compute[226294]: 2026-02-02 10:19:43.035 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:43 compute-1 nova_compute[226294]: 2026-02-02 10:19:43.035 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:43 compute-1 sudo[246274]: pam_unix(sudo:session): session closed for user root
Feb 02 10:19:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:43.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:43 compute-1 sudo[246320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:19:43 compute-1 sudo[246320]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:19:43 compute-1 sudo[246320]: pam_unix(sudo:session): session closed for user root
Feb 02 10:19:43 compute-1 sudo[246345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:19:43 compute-1 sudo[246345]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:19:43 compute-1 sudo[246345]: pam_unix(sudo:session): session closed for user root
Feb 02 10:19:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:44.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:19:44 compute-1 ceph-mon[80115]: pgmap v1212: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:19:44 compute-1 ceph-mon[80115]: pgmap v1213: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 613 B/s rd, 0 op/s
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:19:44 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:19:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:19:44.919 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:19:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:19:44.920 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:19:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:19:44.920 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:19:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:45.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:46.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:19:46 compute-1 ceph-mon[80115]: pgmap v1214: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 613 B/s rd, 0 op/s
Feb 02 10:19:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:47.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:19:48 compute-1 nova_compute[226294]: 2026-02-02 10:19:48.036 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:48 compute-1 nova_compute[226294]: 2026-02-02 10:19:48.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:48 compute-1 nova_compute[226294]: 2026-02-02 10:19:48.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:19:48 compute-1 nova_compute[226294]: 2026-02-02 10:19:48.038 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:48 compute-1 nova_compute[226294]: 2026-02-02 10:19:48.070 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:48 compute-1 nova_compute[226294]: 2026-02-02 10:19:48.070 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:48.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:48 compute-1 sudo[246405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:19:48 compute-1 sudo[246405]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:19:48 compute-1 sudo[246405]: pam_unix(sudo:session): session closed for user root
Feb 02 10:19:48 compute-1 ceph-mon[80115]: pgmap v1215: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 613 B/s rd, 0 op/s
Feb 02 10:19:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:19:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:19:48 compute-1 podman[246429]: 2026-02-02 10:19:48.944069836 +0000 UTC m=+0.100845701 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:19:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:49.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:49 compute-1 ceph-mon[80115]: pgmap v1216: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 613 B/s rd, 0 op/s
Feb 02 10:19:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:50.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:51.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:52.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:19:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:52 compute-1 ceph-mon[80115]: pgmap v1217: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 613 B/s rd, 0 op/s
Feb 02 10:19:53 compute-1 nova_compute[226294]: 2026-02-02 10:19:53.071 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:53.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:53 compute-1 ceph-mon[80115]: pgmap v1218: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 613 B/s rd, 0 op/s
Feb 02 10:19:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:19:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:54.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:19:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3058379664' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:19:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/3058379664' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:19:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:55.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:19:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:56.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:56 compute-1 ceph-mon[80115]: pgmap v1219: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:57.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:19:57 compute-1 ceph-mon[80115]: pgmap v1220: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:19:58 compute-1 nova_compute[226294]: 2026-02-02 10:19:58.073 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:58 compute-1 nova_compute[226294]: 2026-02-02 10:19:58.075 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:19:58 compute-1 nova_compute[226294]: 2026-02-02 10:19:58.075 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:19:58 compute-1 nova_compute[226294]: 2026-02-02 10:19:58.075 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:58 compute-1 nova_compute[226294]: 2026-02-02 10:19:58.121 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:19:58 compute-1 nova_compute[226294]: 2026-02-02 10:19:58.122 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:19:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:19:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:19:58.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:19:58 compute-1 podman[246463]: 2026-02-02 10:19:58.392246533 +0000 UTC m=+0.063973594 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 02 10:19:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:19:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:19:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:19:59.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:00.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:00 compute-1 ceph-mon[80115]: pgmap v1221: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:20:00 compute-1 ceph-mon[80115]: Health detail: HEALTH_WARN 2 failed cephadm daemon(s)
Feb 02 10:20:00 compute-1 ceph-mon[80115]: [WRN] CEPHADM_FAILED_DAEMON: 2 failed cephadm daemon(s)
Feb 02 10:20:00 compute-1 ceph-mon[80115]:     daemon nfs.cephfs.0.0.compute-1.mhzhsx on compute-1 is in error state
Feb 02 10:20:00 compute-1 ceph-mon[80115]:     daemon nfs.cephfs.1.0.compute-2.dciyfa on compute-2 is in error state
Feb 02 10:20:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:01.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:02.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:02 compute-1 sudo[246485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:20:02 compute-1 sudo[246485]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:20:02 compute-1 sudo[246485]: pam_unix(sudo:session): session closed for user root
Feb 02 10:20:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:02 compute-1 ceph-mon[80115]: pgmap v1222: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:20:03 compute-1 nova_compute[226294]: 2026-02-02 10:20:03.122 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:03 compute-1 nova_compute[226294]: 2026-02-02 10:20:03.124 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:03 compute-1 nova_compute[226294]: 2026-02-02 10:20:03.124 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:20:03 compute-1 nova_compute[226294]: 2026-02-02 10:20:03.124 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:03 compute-1 nova_compute[226294]: 2026-02-02 10:20:03.125 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:03 compute-1 nova_compute[226294]: 2026-02-02 10:20:03.126 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:03.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:03 compute-1 ceph-mon[80115]: pgmap v1223: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:20:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:04.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:05.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:06.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:06 compute-1 ceph-mon[80115]: pgmap v1224: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:07.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:08 compute-1 ceph-mon[80115]: pgmap v1225: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:08 compute-1 nova_compute[226294]: 2026-02-02 10:20:08.127 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:08 compute-1 nova_compute[226294]: 2026-02-02 10:20:08.129 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:08 compute-1 nova_compute[226294]: 2026-02-02 10:20:08.129 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:20:08 compute-1 nova_compute[226294]: 2026-02-02 10:20:08.129 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:08 compute-1 nova_compute[226294]: 2026-02-02 10:20:08.158 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:08 compute-1 nova_compute[226294]: 2026-02-02 10:20:08.159 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:08.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:09.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:10.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:10 compute-1 ceph-mon[80115]: pgmap v1226: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:20:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:11.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:12.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:12 compute-1 ceph-mon[80115]: pgmap v1227: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:13 compute-1 nova_compute[226294]: 2026-02-02 10:20:13.160 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:13 compute-1 nova_compute[226294]: 2026-02-02 10:20:13.161 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:13 compute-1 nova_compute[226294]: 2026-02-02 10:20:13.162 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:20:13 compute-1 nova_compute[226294]: 2026-02-02 10:20:13.162 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:13 compute-1 nova_compute[226294]: 2026-02-02 10:20:13.196 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:13 compute-1 nova_compute[226294]: 2026-02-02 10:20:13.197 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:13.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:13 compute-1 nova_compute[226294]: 2026-02-02 10:20:13.197 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:13 compute-1 ceph-mon[80115]: pgmap v1228: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:20:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:14.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2747210279' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:20:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:15.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3454324369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:20:16 compute-1 ceph-mon[80115]: pgmap v1229: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:16.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:17.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:20:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:18.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:18 compute-1 nova_compute[226294]: 2026-02-02 10:20:18.198 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:18 compute-1 nova_compute[226294]: 2026-02-02 10:20:18.200 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:18 compute-1 nova_compute[226294]: 2026-02-02 10:20:18.200 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:20:18 compute-1 nova_compute[226294]: 2026-02-02 10:20:18.200 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:18 compute-1 nova_compute[226294]: 2026-02-02 10:20:18.253 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:18 compute-1 nova_compute[226294]: 2026-02-02 10:20:18.254 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:18 compute-1 ceph-mon[80115]: pgmap v1230: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:18 compute-1 nova_compute[226294]: 2026-02-02 10:20:18.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:18 compute-1 nova_compute[226294]: 2026-02-02 10:20:18.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:19.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:19 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2950206628' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:20:19 compute-1 podman[246519]: 2026-02-02 10:20:19.4598973 +0000 UTC m=+0.141137497 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb 02 10:20:19 compute-1 nova_compute[226294]: 2026-02-02 10:20:19.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:20.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1797396716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:20:20 compute-1 ceph-mon[80115]: pgmap v1231: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:20:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:21.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:21 compute-1 nova_compute[226294]: 2026-02-02 10:20:21.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:21 compute-1 nova_compute[226294]: 2026-02-02 10:20:21.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:20:21 compute-1 nova_compute[226294]: 2026-02-02 10:20:21.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:20:21 compute-1 nova_compute[226294]: 2026-02-02 10:20:21.674 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:20:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:22.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:22 compute-1 sudo[246546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:20:22 compute-1 sudo[246546]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:20:22 compute-1 sudo[246546]: pam_unix(sudo:session): session closed for user root
Feb 02 10:20:22 compute-1 ceph-mon[80115]: pgmap v1232: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:23.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:23 compute-1 nova_compute[226294]: 2026-02-02 10:20:23.255 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:23 compute-1 nova_compute[226294]: 2026-02-02 10:20:23.256 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:23 compute-1 nova_compute[226294]: 2026-02-02 10:20:23.257 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:20:23 compute-1 nova_compute[226294]: 2026-02-02 10:20:23.257 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:23 compute-1 nova_compute[226294]: 2026-02-02 10:20:23.257 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:23 compute-1 nova_compute[226294]: 2026-02-02 10:20:23.259 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:23 compute-1 nova_compute[226294]: 2026-02-02 10:20:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:23 compute-1 nova_compute[226294]: 2026-02-02 10:20:23.670 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:24.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:24 compute-1 nova_compute[226294]: 2026-02-02 10:20:24.665 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:24 compute-1 ceph-mon[80115]: pgmap v1233: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:20:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:25.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:25 compute-1 nova_compute[226294]: 2026-02-02 10:20:25.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:25 compute-1 nova_compute[226294]: 2026-02-02 10:20:25.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:20:26 compute-1 ceph-mon[80115]: pgmap v1234: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:26.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:26 compute-1 nova_compute[226294]: 2026-02-02 10:20:26.650 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:27.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:27 compute-1 nova_compute[226294]: 2026-02-02 10:20:27.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:20:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:27 compute-1 nova_compute[226294]: 2026-02-02 10:20:27.696 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:20:27 compute-1 nova_compute[226294]: 2026-02-02 10:20:27.697 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:20:27 compute-1 nova_compute[226294]: 2026-02-02 10:20:27.697 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:20:27 compute-1 nova_compute[226294]: 2026-02-02 10:20:27.697 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:20:27 compute-1 nova_compute[226294]: 2026-02-02 10:20:27.698 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:20:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:20:28 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1092170194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.166 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:20:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:28.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.257 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.366 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.368 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4884MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.368 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.368 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.451 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.452 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.477 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:20:28 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:20:28 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2889706750' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:20:28 compute-1 ceph-mon[80115]: pgmap v1235: 353 pgs: 353 active+clean; 41 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:28 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1092170194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.916 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.922 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.941 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.943 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:20:28 compute-1 nova_compute[226294]: 2026-02-02 10:20:28.943 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:20:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:29.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:29 compute-1 podman[246619]: 2026-02-02 10:20:29.403965804 +0000 UTC m=+0.081510178 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 02 10:20:29 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2889706750' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:20:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:30.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:30 compute-1 ceph-mon[80115]: pgmap v1236: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 68 KiB/s rd, 0 B/s wr, 113 op/s
Feb 02 10:20:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:31.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:31 compute-1 ceph-mon[80115]: pgmap v1237: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 68 KiB/s rd, 0 B/s wr, 113 op/s
Feb 02 10:20:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:32.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:20:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:33.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:33 compute-1 nova_compute[226294]: 2026-02-02 10:20:33.261 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:33 compute-1 nova_compute[226294]: 2026-02-02 10:20:33.262 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:33 compute-1 nova_compute[226294]: 2026-02-02 10:20:33.262 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:20:33 compute-1 nova_compute[226294]: 2026-02-02 10:20:33.263 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:33 compute-1 nova_compute[226294]: 2026-02-02 10:20:33.299 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:33 compute-1 nova_compute[226294]: 2026-02-02 10:20:33.300 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:34 compute-1 ceph-mon[80115]: pgmap v1238: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 101 KiB/s rd, 0 B/s wr, 167 op/s
Feb 02 10:20:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:34.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:35.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:36.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:36 compute-1 ceph-mon[80115]: pgmap v1239: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 100 KiB/s rd, 0 B/s wr, 167 op/s
Feb 02 10:20:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:37.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:38.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:38 compute-1 nova_compute[226294]: 2026-02-02 10:20:38.300 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:38 compute-1 nova_compute[226294]: 2026-02-02 10:20:38.302 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:38 compute-1 nova_compute[226294]: 2026-02-02 10:20:38.302 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:20:38 compute-1 nova_compute[226294]: 2026-02-02 10:20:38.302 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:38 compute-1 nova_compute[226294]: 2026-02-02 10:20:38.347 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:38 compute-1 nova_compute[226294]: 2026-02-02 10:20:38.348 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:38 compute-1 ceph-mon[80115]: pgmap v1240: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 100 KiB/s rd, 0 B/s wr, 167 op/s
Feb 02 10:20:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:39.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:40.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:40 compute-1 ceph-mon[80115]: pgmap v1241: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 101 KiB/s rd, 0 B/s wr, 167 op/s
Feb 02 10:20:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:41.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:41 compute-1 ceph-mon[80115]: pgmap v1242: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 0 B/s wr, 54 op/s
Feb 02 10:20:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:42.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.716891) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642717029, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 1019, "num_deletes": 251, "total_data_size": 2329272, "memory_usage": 2352552, "flush_reason": "Manual Compaction"}
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642728203, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 1012034, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36824, "largest_seqno": 37838, "table_properties": {"data_size": 1008115, "index_size": 1571, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10365, "raw_average_key_size": 21, "raw_value_size": 999773, "raw_average_value_size": 2036, "num_data_blocks": 66, "num_entries": 491, "num_filter_entries": 491, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027567, "oldest_key_time": 1770027567, "file_creation_time": 1770027642, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 11381 microseconds, and 5805 cpu microseconds.
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.728293) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 1012034 bytes OK
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.728325) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.731332) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.731372) EVENT_LOG_v1 {"time_micros": 1770027642731361, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.731405) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 2324180, prev total WAL file size 2324180, number of live WAL files 2.
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.732874) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303030' seq:72057594037927935, type:22 .. '6D6772737461740031323532' seq:0, type:0; will stop at (end)
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(988KB)], [69(13MB)]
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642732948, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 15520030, "oldest_snapshot_seqno": -1}
Feb 02 10:20:42 compute-1 sudo[246644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:20:42 compute-1 sudo[246644]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:20:42 compute-1 sudo[246644]: pam_unix(sudo:session): session closed for user root
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6544 keys, 11914868 bytes, temperature: kUnknown
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642855491, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 11914868, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11874566, "index_size": 22846, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 172256, "raw_average_key_size": 26, "raw_value_size": 11759884, "raw_average_value_size": 1797, "num_data_blocks": 894, "num_entries": 6544, "num_filter_entries": 6544, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027642, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.855737) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 11914868 bytes
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.857132) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.6 rd, 97.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 13.8 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(27.1) write-amplify(11.8) OK, records in: 7032, records dropped: 488 output_compression: NoCompression
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.857175) EVENT_LOG_v1 {"time_micros": 1770027642857163, "job": 42, "event": "compaction_finished", "compaction_time_micros": 122601, "compaction_time_cpu_micros": 25212, "output_level": 6, "num_output_files": 1, "total_output_size": 11914868, "num_input_records": 7032, "num_output_records": 6544, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642857442, "job": 42, "event": "table_file_deletion", "file_number": 71}
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027642859199, "job": 42, "event": "table_file_deletion", "file_number": 69}
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.732222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859345) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859354) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859359) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:20:42 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:20:42.859361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:20:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:43.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:43 compute-1 nova_compute[226294]: 2026-02-02 10:20:43.349 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:43 compute-1 nova_compute[226294]: 2026-02-02 10:20:43.351 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:43 compute-1 nova_compute[226294]: 2026-02-02 10:20:43.351 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:20:43 compute-1 nova_compute[226294]: 2026-02-02 10:20:43.351 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:43 compute-1 nova_compute[226294]: 2026-02-02 10:20:43.394 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:43 compute-1 nova_compute[226294]: 2026-02-02 10:20:43.395 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:20:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:44.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:44 compute-1 ceph-mon[80115]: pgmap v1243: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 0 B/s wr, 54 op/s
Feb 02 10:20:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:20:44.921 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:20:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:20:44.921 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:20:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:20:44.921 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:20:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:45.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:46.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:46 compute-1 ceph-mon[80115]: pgmap v1244: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:47.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:20:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:48.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:48 compute-1 nova_compute[226294]: 2026-02-02 10:20:48.396 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:20:48 compute-1 ceph-mon[80115]: pgmap v1245: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:20:49 compute-1 sudo[246673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:20:49 compute-1 sudo[246673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:20:49 compute-1 sudo[246673]: pam_unix(sudo:session): session closed for user root
Feb 02 10:20:49 compute-1 sudo[246698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:20:49 compute-1 sudo[246698]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:20:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:49.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:49 compute-1 sudo[246698]: pam_unix(sudo:session): session closed for user root
Feb 02 10:20:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:20:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:20:49 compute-1 ceph-mon[80115]: pgmap v1246: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 775 B/s rd, 0 op/s
Feb 02 10:20:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:20:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:20:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:20:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:20:49 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:20:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:50.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:50 compute-1 podman[246755]: 2026-02-02 10:20:50.415286684 +0000 UTC m=+0.092190022 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 02 10:20:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:51.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:52.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:52 compute-1 ceph-mon[80115]: pgmap v1247: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 517 B/s rd, 0 op/s
Feb 02 10:20:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:53.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:53 compute-1 nova_compute[226294]: 2026-02-02 10:20:53.397 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:20:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:54.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:20:54 compute-1 sudo[246785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:20:54 compute-1 sudo[246785]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:20:54 compute-1 sudo[246785]: pam_unix(sudo:session): session closed for user root
Feb 02 10:20:54 compute-1 ceph-mon[80115]: pgmap v1248: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 775 B/s rd, 0 op/s
Feb 02 10:20:54 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:20:54 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:20:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:55.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/91022654' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:20:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/91022654' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:20:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:56.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:56 compute-1 ceph-mon[80115]: pgmap v1249: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 517 B/s rd, 0 op/s
Feb 02 10:20:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:57.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:20:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:20:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:20:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:20:58.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:20:58 compute-1 nova_compute[226294]: 2026-02-02 10:20:58.401 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:20:58 compute-1 ceph-mon[80115]: pgmap v1250: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 517 B/s rd, 0 op/s
Feb 02 10:20:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:20:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:20:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:20:59.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:00 compute-1 podman[246813]: 2026-02-02 10:21:00.369045714 +0000 UTC m=+0.046153973 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 02 10:21:00 compute-1 ceph-mon[80115]: pgmap v1251: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 775 B/s rd, 0 op/s
Feb 02 10:21:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:01.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:02.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:02 compute-1 sudo[246835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:21:02 compute-1 sudo[246835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:21:02 compute-1 sudo[246835]: pam_unix(sudo:session): session closed for user root
Feb 02 10:21:02 compute-1 ceph-mon[80115]: pgmap v1252: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:21:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:03.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:03 compute-1 nova_compute[226294]: 2026-02-02 10:21:03.403 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:21:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:04.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:04 compute-1 ceph-mon[80115]: pgmap v1253: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:05.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:06.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:06 compute-1 ceph-mon[80115]: pgmap v1254: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.002000053s ======
Feb 02 10:21:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:07.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Feb 02 10:21:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:08 compute-1 ceph-mon[80115]: pgmap v1255: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:08.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:08 compute-1 nova_compute[226294]: 2026-02-02 10:21:08.406 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:21:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:09.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:10 compute-1 ceph-mon[80115]: pgmap v1256: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:10.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:11.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:12.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:12 compute-1 ceph-mon[80115]: pgmap v1257: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:13.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:13 compute-1 nova_compute[226294]: 2026-02-02 10:21:13.408 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:14 compute-1 ceph-mon[80115]: pgmap v1258: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:14.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:15 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1499987093' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:21:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:15.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/462763208' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:21:16 compute-1 ceph-mon[80115]: pgmap v1259: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:16.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:17.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:21:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:18.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:18 compute-1 ceph-mon[80115]: pgmap v1260: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:18 compute-1 nova_compute[226294]: 2026-02-02 10:21:18.411 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:21:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:19.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:19 compute-1 nova_compute[226294]: 2026-02-02 10:21:19.943 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:21:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:20.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:20 compute-1 nova_compute[226294]: 2026-02-02 10:21:20.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:21:20 compute-1 nova_compute[226294]: 2026-02-02 10:21:20.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:21:20 compute-1 ceph-mon[80115]: pgmap v1261: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3647188461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:21:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:21.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:21 compute-1 podman[246870]: 2026-02-02 10:21:21.486745827 +0000 UTC m=+0.155917809 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 02 10:21:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3799849492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:21:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:22.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:22 compute-1 ceph-mon[80115]: pgmap v1262: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:22 compute-1 sudo[246897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:21:22 compute-1 sudo[246897]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:21:22 compute-1 sudo[246897]: pam_unix(sudo:session): session closed for user root
Feb 02 10:21:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:23.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:23 compute-1 nova_compute[226294]: 2026-02-02 10:21:23.412 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:23 compute-1 nova_compute[226294]: 2026-02-02 10:21:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:21:23 compute-1 nova_compute[226294]: 2026-02-02 10:21:23.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:21:23 compute-1 nova_compute[226294]: 2026-02-02 10:21:23.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:21:23 compute-1 nova_compute[226294]: 2026-02-02 10:21:23.665 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:21:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:24.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:24 compute-1 nova_compute[226294]: 2026-02-02 10:21:24.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:21:24 compute-1 ceph-mon[80115]: pgmap v1263: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:25.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:26.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:26 compute-1 nova_compute[226294]: 2026-02-02 10:21:26.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:21:26 compute-1 nova_compute[226294]: 2026-02-02 10:21:26.647 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:21:26 compute-1 ceph-mon[80115]: pgmap v1264: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:27.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:27 compute-1 nova_compute[226294]: 2026-02-02 10:21:27.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:21:27 compute-1 nova_compute[226294]: 2026-02-02 10:21:27.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:21:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:28.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:28 compute-1 nova_compute[226294]: 2026-02-02 10:21:28.415 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:28 compute-1 ceph-mon[80115]: pgmap v1265: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:29.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:29 compute-1 nova_compute[226294]: 2026-02-02 10:21:29.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:21:29 compute-1 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:21:29 compute-1 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:21:29 compute-1 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:21:29 compute-1 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:21:29 compute-1 nova_compute[226294]: 2026-02-02 10:21:29.688 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:21:30 compute-1 ceph-mon[80115]: pgmap v1266: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:21:30 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3243985345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.174 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:21:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:30.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.308 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.310 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4884MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.310 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.311 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.375 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.375 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.391 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:21:30 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:21:30 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/207582716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.846 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.852 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.869 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.871 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:21:30 compute-1 nova_compute[226294]: 2026-02-02 10:21:30.872 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:21:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3243985345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:21:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/207582716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:21:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:31.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:31 compute-1 podman[246970]: 2026-02-02 10:21:31.395231729 +0000 UTC m=+0.066272375 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Feb 02 10:21:32 compute-1 ceph-mon[80115]: pgmap v1267: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:32.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:21:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:33.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:33 compute-1 nova_compute[226294]: 2026-02-02 10:21:33.417 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:34 compute-1 ceph-mon[80115]: pgmap v1268: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:34.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:35.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:36.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:36 compute-1 ceph-mon[80115]: pgmap v1269: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:37.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:38.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:38 compute-1 nova_compute[226294]: 2026-02-02 10:21:38.420 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:21:38 compute-1 nova_compute[226294]: 2026-02-02 10:21:38.422 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:39 compute-1 ceph-mon[80115]: pgmap v1270: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:39.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:40 compute-1 ceph-mon[80115]: pgmap v1271: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:40.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:41.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:42.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:42 compute-1 ceph-mon[80115]: pgmap v1272: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:43 compute-1 sudo[246997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:21:43 compute-1 sudo[246997]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:21:43 compute-1 sudo[246997]: pam_unix(sudo:session): session closed for user root
Feb 02 10:21:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:43.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:43 compute-1 nova_compute[226294]: 2026-02-02 10:21:43.423 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:21:43 compute-1 nova_compute[226294]: 2026-02-02 10:21:43.425 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:21:43 compute-1 nova_compute[226294]: 2026-02-02 10:21:43.425 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:21:43 compute-1 nova_compute[226294]: 2026-02-02 10:21:43.425 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:21:43 compute-1 nova_compute[226294]: 2026-02-02 10:21:43.462 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:43 compute-1 nova_compute[226294]: 2026-02-02 10:21:43.463 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:21:44 compute-1 ceph-mon[80115]: pgmap v1273: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:44.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:21:44.922 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:21:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:21:44.923 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:21:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:21:44.923 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:21:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:45.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:46 compute-1 ceph-mon[80115]: pgmap v1274: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:46.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:47.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:21:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:48.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:48 compute-1 ceph-mon[80115]: pgmap v1275: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:48 compute-1 nova_compute[226294]: 2026-02-02 10:21:48.464 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:49.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:21:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:50.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:50 compute-1 ceph-mon[80115]: pgmap v1276: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:51.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:52.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:52 compute-1 podman[247026]: 2026-02-02 10:21:52.408659693 +0000 UTC m=+0.085027412 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 02 10:21:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:52 compute-1 ceph-mon[80115]: pgmap v1277: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:21:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:53.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:53 compute-1 nova_compute[226294]: 2026-02-02 10:21:53.466 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:21:53 compute-1 nova_compute[226294]: 2026-02-02 10:21:53.468 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:21:53 compute-1 nova_compute[226294]: 2026-02-02 10:21:53.468 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:21:53 compute-1 nova_compute[226294]: 2026-02-02 10:21:53.469 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:21:53 compute-1 nova_compute[226294]: 2026-02-02 10:21:53.501 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:53 compute-1 nova_compute[226294]: 2026-02-02 10:21:53.502 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:21:53 compute-1 ceph-mon[80115]: pgmap v1278: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:21:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:54.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:54 compute-1 sudo[247054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:21:54 compute-1 sudo[247054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:21:54 compute-1 sudo[247054]: pam_unix(sudo:session): session closed for user root
Feb 02 10:21:55 compute-1 sudo[247079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:21:55 compute-1 sudo[247079]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:21:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 02 10:21:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2404311604' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:21:55 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 02 10:21:55 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2404311604' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:21:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2404311604' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:21:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2404311604' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:21:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:21:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:55.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:21:55 compute-1 sudo[247079]: pam_unix(sudo:session): session closed for user root
Feb 02 10:21:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:21:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:21:56 compute-1 ceph-mon[80115]: pgmap v1279: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 518 B/s rd, 0 op/s
Feb 02 10:21:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:21:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:21:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:21:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:21:56 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:21:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:56.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:57.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:21:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:21:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:21:58.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:21:58 compute-1 nova_compute[226294]: 2026-02-02 10:21:58.503 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:58 compute-1 nova_compute[226294]: 2026-02-02 10:21:58.504 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:21:58 compute-1 ceph-mon[80115]: pgmap v1280: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 518 B/s rd, 0 op/s
Feb 02 10:21:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:21:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:21:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:21:59.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:22:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:00.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:00 compute-1 ceph-mon[80115]: pgmap v1281: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 777 B/s rd, 0 op/s
Feb 02 10:22:00 compute-1 sudo[247138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:22:00 compute-1 sudo[247138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:22:00 compute-1 sudo[247138]: pam_unix(sudo:session): session closed for user root
Feb 02 10:22:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:01.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:22:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:22:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:02.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:02 compute-1 podman[247163]: 2026-02-02 10:22:02.387223929 +0000 UTC m=+0.054713670 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 02 10:22:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:02 compute-1 ceph-mon[80115]: pgmap v1282: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 518 B/s rd, 0 op/s
Feb 02 10:22:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:22:03 compute-1 sudo[247184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:22:03 compute-1 sudo[247184]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:22:03 compute-1 sudo[247184]: pam_unix(sudo:session): session closed for user root
Feb 02 10:22:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:03.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:03 compute-1 nova_compute[226294]: 2026-02-02 10:22:03.505 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:03 compute-1 nova_compute[226294]: 2026-02-02 10:22:03.507 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:03 compute-1 nova_compute[226294]: 2026-02-02 10:22:03.507 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:22:03 compute-1 nova_compute[226294]: 2026-02-02 10:22:03.507 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:03 compute-1 nova_compute[226294]: 2026-02-02 10:22:03.542 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:03 compute-1 nova_compute[226294]: 2026-02-02 10:22:03.543 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:04.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:04 compute-1 ceph-mon[80115]: pgmap v1283: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 777 B/s rd, 0 op/s
Feb 02 10:22:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:05.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:22:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:06.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:22:06 compute-1 ceph-mon[80115]: pgmap v1284: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 518 B/s rd, 0 op/s
Feb 02 10:22:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:07.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:08 compute-1 ceph-mon[80115]: pgmap v1285: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:08.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:08 compute-1 nova_compute[226294]: 2026-02-02 10:22:08.543 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:09.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:10.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:10 compute-1 ceph-mon[80115]: pgmap v1286: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:10 compute-1 sshd-session[247212]: Invalid user solv from 80.94.92.184 port 37290
Feb 02 10:22:11 compute-1 sshd-session[247212]: Connection closed by invalid user solv 80.94.92.184 port 37290 [preauth]
Feb 02 10:22:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:11.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:12 compute-1 ceph-mon[80115]: pgmap v1287: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:12.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:13.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:13 compute-1 nova_compute[226294]: 2026-02-02 10:22:13.546 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:14.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:14 compute-1 ceph-mon[80115]: pgmap v1288: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:22:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:15.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:22:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:16.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:16 compute-1 ceph-mon[80115]: pgmap v1289: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3106572320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:22:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2890586090' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:22:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:17.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:22:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:18.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:18 compute-1 nova_compute[226294]: 2026-02-02 10:22:18.548 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:18 compute-1 nova_compute[226294]: 2026-02-02 10:22:18.550 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:18 compute-1 nova_compute[226294]: 2026-02-02 10:22:18.550 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:22:18 compute-1 nova_compute[226294]: 2026-02-02 10:22:18.550 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:18 compute-1 nova_compute[226294]: 2026-02-02 10:22:18.552 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:18 compute-1 nova_compute[226294]: 2026-02-02 10:22:18.553 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:18 compute-1 ceph-mon[80115]: pgmap v1290: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:22:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:19.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:22:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:20.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:20 compute-1 ceph-mon[80115]: pgmap v1291: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2708372678' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:22:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:21.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3640400089' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:22:21 compute-1 nova_compute[226294]: 2026-02-02 10:22:21.872 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:21 compute-1 nova_compute[226294]: 2026-02-02 10:22:21.873 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:22.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:22 compute-1 nova_compute[226294]: 2026-02-02 10:22:22.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:22 compute-1 ceph-mon[80115]: pgmap v1292: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:23 compute-1 sudo[247221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:22:23 compute-1 sudo[247221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:22:23 compute-1 sudo[247221]: pam_unix(sudo:session): session closed for user root
Feb 02 10:22:23 compute-1 podman[247245]: 2026-02-02 10:22:23.283879602 +0000 UTC m=+0.091103142 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 02 10:22:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:23.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:23 compute-1 nova_compute[226294]: 2026-02-02 10:22:23.553 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:23 compute-1 nova_compute[226294]: 2026-02-02 10:22:23.554 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:23 compute-1 nova_compute[226294]: 2026-02-02 10:22:23.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:23 compute-1 nova_compute[226294]: 2026-02-02 10:22:23.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:22:23 compute-1 nova_compute[226294]: 2026-02-02 10:22:23.650 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:22:23 compute-1 nova_compute[226294]: 2026-02-02 10:22:23.675 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:22:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:22:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:24.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:22:24 compute-1 nova_compute[226294]: 2026-02-02 10:22:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:24 compute-1 ceph-mon[80115]: pgmap v1293: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:25.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:25 compute-1 nova_compute[226294]: 2026-02-02 10:22:25.644 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:26.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:26 compute-1 ceph-mon[80115]: pgmap v1294: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:27.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:27 compute-1 nova_compute[226294]: 2026-02-02 10:22:27.658 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:28.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:28 compute-1 nova_compute[226294]: 2026-02-02 10:22:28.555 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:28 compute-1 nova_compute[226294]: 2026-02-02 10:22:28.557 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:28 compute-1 nova_compute[226294]: 2026-02-02 10:22:28.557 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:22:28 compute-1 nova_compute[226294]: 2026-02-02 10:22:28.558 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:28 compute-1 nova_compute[226294]: 2026-02-02 10:22:28.594 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:28 compute-1 nova_compute[226294]: 2026-02-02 10:22:28.595 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:28 compute-1 nova_compute[226294]: 2026-02-02 10:22:28.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:29 compute-1 ceph-mon[80115]: pgmap v1295: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:29.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:29 compute-1 nova_compute[226294]: 2026-02-02 10:22:29.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:29 compute-1 nova_compute[226294]: 2026-02-02 10:22:29.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:22:30 compute-1 ceph-mon[80115]: pgmap v1296: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:30.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:22:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:31.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:22:31 compute-1 nova_compute[226294]: 2026-02-02 10:22:31.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:22:31 compute-1 nova_compute[226294]: 2026-02-02 10:22:31.673 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:22:31 compute-1 nova_compute[226294]: 2026-02-02 10:22:31.674 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:22:31 compute-1 nova_compute[226294]: 2026-02-02 10:22:31.675 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:22:31 compute-1 nova_compute[226294]: 2026-02-02 10:22:31.675 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:22:31 compute-1 nova_compute[226294]: 2026-02-02 10:22:31.676 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:22:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:22:32 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/689481102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.171 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.299 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.300 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4874MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.300 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.301 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:22:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:32.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.405 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.405 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.419 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:22:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:32 compute-1 ceph-mon[80115]: pgmap v1297: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:32 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/689481102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:22:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:22:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:22:32 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1174471894' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.890 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.896 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.917 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.920 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:22:32 compute-1 nova_compute[226294]: 2026-02-02 10:22:32.921 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:22:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:33.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:33 compute-1 podman[247322]: 2026-02-02 10:22:33.384798758 +0000 UTC m=+0.060374539 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:22:33 compute-1 nova_compute[226294]: 2026-02-02 10:22:33.596 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:33 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1174471894' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:22:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:34.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:34 compute-1 ceph-mon[80115]: pgmap v1298: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:35.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:36.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:36 compute-1 ceph-mon[80115]: pgmap v1299: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:37.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:38.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:38 compute-1 nova_compute[226294]: 2026-02-02 10:22:38.598 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:38 compute-1 ceph-mon[80115]: pgmap v1300: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:39.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:40.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:40 compute-1 ceph-mon[80115]: pgmap v1301: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:41.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:42.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:42 compute-1 ceph-mon[80115]: pgmap v1302: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:43 compute-1 sudo[247346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:22:43 compute-1 sudo[247346]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:22:43 compute-1 sudo[247346]: pam_unix(sudo:session): session closed for user root
Feb 02 10:22:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:43.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:43 compute-1 nova_compute[226294]: 2026-02-02 10:22:43.600 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:44.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:44 compute-1 ceph-mon[80115]: pgmap v1303: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:22:44.923 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:22:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:22:44.924 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:22:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:22:44.924 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:22:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:45.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:46 compute-1 ceph-mon[80115]: pgmap v1304: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:46.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:22:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:47.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:48 compute-1 ceph-mon[80115]: pgmap v1305: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:48.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:48 compute-1 nova_compute[226294]: 2026-02-02 10:22:48.604 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:48 compute-1 nova_compute[226294]: 2026-02-02 10:22:48.606 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:48 compute-1 nova_compute[226294]: 2026-02-02 10:22:48.607 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:22:48 compute-1 nova_compute[226294]: 2026-02-02 10:22:48.607 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:48 compute-1 nova_compute[226294]: 2026-02-02 10:22:48.633 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:48 compute-1 nova_compute[226294]: 2026-02-02 10:22:48.633 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:49.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:50.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:50 compute-1 ceph-mon[80115]: pgmap v1306: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:51.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:52.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:52 compute-1 ceph-mon[80115]: pgmap v1307: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:53.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:53 compute-1 podman[247376]: 2026-02-02 10:22:53.466393744 +0000 UTC m=+0.142899065 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 02 10:22:53 compute-1 nova_compute[226294]: 2026-02-02 10:22:53.633 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:53 compute-1 nova_compute[226294]: 2026-02-02 10:22:53.635 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:22:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:54.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:22:54 compute-1 ceph-mon[80115]: pgmap v1308: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:22:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:55.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:56.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:56 compute-1 ceph-mon[80115]: pgmap v1309: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:22:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:57.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:22:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:22:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:22:58.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:22:58 compute-1 nova_compute[226294]: 2026-02-02 10:22:58.635 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:22:58 compute-1 nova_compute[226294]: 2026-02-02 10:22:58.636 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:58 compute-1 nova_compute[226294]: 2026-02-02 10:22:58.636 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:22:58 compute-1 nova_compute[226294]: 2026-02-02 10:22:58.637 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:58 compute-1 nova_compute[226294]: 2026-02-02 10:22:58.637 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:22:58 compute-1 nova_compute[226294]: 2026-02-02 10:22:58.639 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:22:58 compute-1 ceph-mon[80115]: pgmap v1310: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.826787) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778826827, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1568, "num_deletes": 251, "total_data_size": 3859254, "memory_usage": 3913120, "flush_reason": "Manual Compaction"}
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778849514, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 2515296, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37843, "largest_seqno": 39406, "table_properties": {"data_size": 2508757, "index_size": 3674, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13899, "raw_average_key_size": 19, "raw_value_size": 2495610, "raw_average_value_size": 3590, "num_data_blocks": 160, "num_entries": 695, "num_filter_entries": 695, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027643, "oldest_key_time": 1770027643, "file_creation_time": 1770027778, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 22786 microseconds, and 6479 cpu microseconds.
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.849572) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 2515296 bytes OK
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.849596) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.851735) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.851756) EVENT_LOG_v1 {"time_micros": 1770027778851749, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.851779) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 3852011, prev total WAL file size 3852011, number of live WAL files 2.
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.852716) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(2456KB)], [72(11MB)]
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778852766, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 14430164, "oldest_snapshot_seqno": -1}
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6723 keys, 12201192 bytes, temperature: kUnknown
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778953628, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 12201192, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12159961, "index_size": 23327, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16837, "raw_key_size": 176643, "raw_average_key_size": 26, "raw_value_size": 12042319, "raw_average_value_size": 1791, "num_data_blocks": 910, "num_entries": 6723, "num_filter_entries": 6723, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027778, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.953988) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 12201192 bytes
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.956528) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.9 rd, 120.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 11.4 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(10.6) write-amplify(4.9) OK, records in: 7239, records dropped: 516 output_compression: NoCompression
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.956566) EVENT_LOG_v1 {"time_micros": 1770027778956548, "job": 44, "event": "compaction_finished", "compaction_time_micros": 100960, "compaction_time_cpu_micros": 31418, "output_level": 6, "num_output_files": 1, "total_output_size": 12201192, "num_input_records": 7239, "num_output_records": 6723, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778957335, "job": 44, "event": "table_file_deletion", "file_number": 74}
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027778959478, "job": 44, "event": "table_file_deletion", "file_number": 72}
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.852574) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959594) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:22:58 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:22:58.959607) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:22:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:22:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:22:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:22:59.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:23:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:00.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:23:00 compute-1 ceph-mon[80115]: pgmap v1311: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:01 compute-1 sudo[247408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:23:01 compute-1 sudo[247408]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:23:01 compute-1 sudo[247408]: pam_unix(sudo:session): session closed for user root
Feb 02 10:23:01 compute-1 sudo[247433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:23:01 compute-1 sudo[247433]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:23:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:01.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:01 compute-1 sudo[247433]: pam_unix(sudo:session): session closed for user root
Feb 02 10:23:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:23:01 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:23:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:02.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:02 compute-1 ceph-mon[80115]: pgmap v1312: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:02 compute-1 ceph-mon[80115]: pgmap v1313: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 605 B/s rd, 0 op/s
Feb 02 10:23:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:23:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:23:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:23:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:23:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:23:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:23:03 compute-1 sudo[247491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:23:03 compute-1 sudo[247491]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:23:03 compute-1 sudo[247491]: pam_unix(sudo:session): session closed for user root
Feb 02 10:23:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:03.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:03 compute-1 nova_compute[226294]: 2026-02-02 10:23:03.640 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:04 compute-1 ceph-mon[80115]: pgmap v1314: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 605 B/s rd, 0 op/s
Feb 02 10:23:04 compute-1 podman[247516]: 2026-02-02 10:23:04.410090953 +0000 UTC m=+0.086244613 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Feb 02 10:23:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:04.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:05.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:06.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:06 compute-1 ceph-mon[80115]: pgmap v1315: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 605 B/s rd, 0 op/s
Feb 02 10:23:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:07.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:07 compute-1 sudo[247538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:23:07 compute-1 sudo[247538]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:23:07 compute-1 sudo[247538]: pam_unix(sudo:session): session closed for user root
Feb 02 10:23:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:08.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:08 compute-1 nova_compute[226294]: 2026-02-02 10:23:08.640 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:08 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:23:08 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:23:08 compute-1 ceph-mon[80115]: pgmap v1316: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 605 B/s rd, 0 op/s
Feb 02 10:23:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:09.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:10.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:10 compute-1 ceph-mon[80115]: pgmap v1317: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 605 B/s rd, 0 op/s
Feb 02 10:23:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:11.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:12 compute-1 ceph-mon[80115]: pgmap v1318: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 605 B/s rd, 0 op/s
Feb 02 10:23:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:12.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:13.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:13 compute-1 nova_compute[226294]: 2026-02-02 10:23:13.642 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:14.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:14 compute-1 ceph-mon[80115]: pgmap v1319: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:15.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:16.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:16 compute-1 ceph-mon[80115]: pgmap v1320: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:16 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1037213434' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:23:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:17.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:23:17 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3544513142' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:23:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:18.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:18 compute-1 nova_compute[226294]: 2026-02-02 10:23:18.643 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:18 compute-1 ceph-mon[80115]: pgmap v1321: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:19.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:20.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:20 compute-1 ceph-mon[80115]: pgmap v1322: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/453364845' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:23:20 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/899814096' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:23:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:21.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:22.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:22 compute-1 nova_compute[226294]: 2026-02-02 10:23:22.922 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:22 compute-1 nova_compute[226294]: 2026-02-02 10:23:22.923 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:22 compute-1 ceph-mon[80115]: pgmap v1323: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:23 compute-1 sudo[247571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:23:23 compute-1 sudo[247571]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:23:23 compute-1 sudo[247571]: pam_unix(sudo:session): session closed for user root
Feb 02 10:23:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:23.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:23 compute-1 nova_compute[226294]: 2026-02-02 10:23:23.646 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:24 compute-1 podman[247596]: 2026-02-02 10:23:24.441801866 +0000 UTC m=+0.109861042 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 02 10:23:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:24.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:24 compute-1 nova_compute[226294]: 2026-02-02 10:23:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:24 compute-1 nova_compute[226294]: 2026-02-02 10:23:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:24 compute-1 ceph-mon[80115]: pgmap v1324: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:25.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:25 compute-1 nova_compute[226294]: 2026-02-02 10:23:25.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:25 compute-1 nova_compute[226294]: 2026-02-02 10:23:25.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:23:25 compute-1 nova_compute[226294]: 2026-02-02 10:23:25.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:23:25 compute-1 nova_compute[226294]: 2026-02-02 10:23:25.665 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:23:25 compute-1 nova_compute[226294]: 2026-02-02 10:23:25.666 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:25 compute-1 nova_compute[226294]: 2026-02-02 10:23:25.666 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 02 10:23:26 compute-1 ceph-mon[80115]: pgmap v1325: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:26.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:26 compute-1 nova_compute[226294]: 2026-02-02 10:23:26.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:27.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:28.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:28 compute-1 nova_compute[226294]: 2026-02-02 10:23:28.648 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:28 compute-1 nova_compute[226294]: 2026-02-02 10:23:28.658 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:28 compute-1 ceph-mon[80115]: pgmap v1326: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:29.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:30.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:30 compute-1 nova_compute[226294]: 2026-02-02 10:23:30.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:31 compute-1 ceph-mon[80115]: pgmap v1327: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:31.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:31 compute-1 nova_compute[226294]: 2026-02-02 10:23:31.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:31 compute-1 nova_compute[226294]: 2026-02-02 10:23:31.648 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:23:32 compute-1 ceph-mon[80115]: pgmap v1328: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:32.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:32 compute-1 nova_compute[226294]: 2026-02-02 10:23:32.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:32 compute-1 nova_compute[226294]: 2026-02-02 10:23:32.671 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:23:32 compute-1 nova_compute[226294]: 2026-02-02 10:23:32.671 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:23:32 compute-1 nova_compute[226294]: 2026-02-02 10:23:32.672 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:23:32 compute-1 nova_compute[226294]: 2026-02-02 10:23:32.672 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:23:32 compute-1 nova_compute[226294]: 2026-02-02 10:23:32.673 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:23:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:33 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:23:33 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:23:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2284638192' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.175 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.328 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.329 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4876MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.330 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.330 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.396 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.397 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.411 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:23:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:33.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.653 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:33 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:23:33 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/761614579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.846 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.850 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.866 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.868 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:23:33 compute-1 nova_compute[226294]: 2026-02-02 10:23:33.869 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:23:34 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2284638192' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:23:34 compute-1 ceph-mon[80115]: pgmap v1329: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:34 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/761614579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:23:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:34.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:35.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:35 compute-1 podman[247673]: 2026-02-02 10:23:35.450325906 +0000 UTC m=+0.066521007 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 02 10:23:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:23:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:36.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:23:36 compute-1 ceph-mon[80115]: pgmap v1330: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:37.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:38.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:38 compute-1 nova_compute[226294]: 2026-02-02 10:23:38.652 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:38 compute-1 nova_compute[226294]: 2026-02-02 10:23:38.658 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:38 compute-1 ceph-mon[80115]: pgmap v1331: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:39.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:39 compute-1 nova_compute[226294]: 2026-02-02 10:23:39.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:23:39 compute-1 nova_compute[226294]: 2026-02-02 10:23:39.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 02 10:23:39 compute-1 nova_compute[226294]: 2026-02-02 10:23:39.664 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 02 10:23:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:40.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:40 compute-1 ceph-mon[80115]: pgmap v1332: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:41.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:42.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:43 compute-1 ceph-mon[80115]: pgmap v1333: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:23:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:43.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:23:43 compute-1 sudo[247696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:23:43 compute-1 sudo[247696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:23:43 compute-1 sudo[247696]: pam_unix(sudo:session): session closed for user root
Feb 02 10:23:43 compute-1 nova_compute[226294]: 2026-02-02 10:23:43.655 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:43 compute-1 nova_compute[226294]: 2026-02-02 10:23:43.659 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:44 compute-1 ceph-mon[80115]: pgmap v1334: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:44.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:23:44.925 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:23:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:23:44.926 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:23:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:23:44.926 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:23:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:23:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:45.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:23:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:46.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:47 compute-1 ceph-mon[80115]: pgmap v1335: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:47.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:48 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:23:48 compute-1 ceph-mon[80115]: pgmap v1336: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:23:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:48.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:23:48 compute-1 nova_compute[226294]: 2026-02-02 10:23:48.657 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:48 compute-1 nova_compute[226294]: 2026-02-02 10:23:48.660 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:49.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:50.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:50 compute-1 ceph-mon[80115]: pgmap v1337: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:23:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:51.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:23:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:52.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:52 compute-1 ceph-mon[80115]: pgmap v1338: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:53.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:53 compute-1 nova_compute[226294]: 2026-02-02 10:23:53.660 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:23:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:54.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:23:54 compute-1 ceph-mon[80115]: pgmap v1339: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:23:55 compute-1 podman[247727]: 2026-02-02 10:23:55.431507539 +0000 UTC m=+0.107515531 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 02 10:23:55 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:55 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:55 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:55.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2533542984' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Feb 02 10:23:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.10:0/2533542984' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Feb 02 10:23:56 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:56 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:56 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:56.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:56 compute-1 ceph-mon[80115]: pgmap v1340: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:57 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:57 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:57 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:57.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:57 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:23:58 compute-1 ceph-mon[80115]: pgmap v1341: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:23:58 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:58 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:58 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:23:58.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:23:58 compute-1 nova_compute[226294]: 2026-02-02 10:23:58.661 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:58 compute-1 nova_compute[226294]: 2026-02-02 10:23:58.663 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:23:59 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:23:59 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:23:59 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:23:59.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:00 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:00 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:00 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:00.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:00 compute-1 ceph-mon[80115]: pgmap v1342: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:24:01 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:01 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:01 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:01.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:02 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:02 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:02 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:02.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:02 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:02 compute-1 ceph-mon[80115]: pgmap v1343: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:02 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:24:03 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:03 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:03 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:03.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:03 compute-1 sudo[247759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:24:03 compute-1 sudo[247759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:24:03 compute-1 sudo[247759]: pam_unix(sudo:session): session closed for user root
Feb 02 10:24:03 compute-1 nova_compute[226294]: 2026-02-02 10:24:03.663 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:04 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:04 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:04 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:04.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:04 compute-1 ceph-mon[80115]: pgmap v1344: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:24:05 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:05 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:05 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:05.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:06 compute-1 podman[247785]: 2026-02-02 10:24:06.398370375 +0000 UTC m=+0.066807064 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 02 10:24:06 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:06 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:06 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:06.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:06 compute-1 ceph-mon[80115]: pgmap v1345: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:07 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:07 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:07 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:07.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:07 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:08 compute-1 sudo[247805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 02 10:24:08 compute-1 sudo[247805]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:24:08 compute-1 sudo[247805]: pam_unix(sudo:session): session closed for user root
Feb 02 10:24:08 compute-1 sudo[247830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/d241d473-9fcb-5f74-b163-f1ca4454e7f1/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Feb 02 10:24:08 compute-1 sudo[247830]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:24:08 compute-1 sudo[247830]: pam_unix(sudo:session): session closed for user root
Feb 02 10:24:08 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:08 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:08 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:08.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:08 compute-1 nova_compute[226294]: 2026-02-02 10:24:08.665 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:24:08 compute-1 nova_compute[226294]: 2026-02-02 10:24:08.668 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:24:09 compute-1 ceph-mon[80115]: pgmap v1346: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:09 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Feb 02 10:24:09 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:24:09 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Feb 02 10:24:09 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:24:09 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:24:09 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Feb 02 10:24:09 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Feb 02 10:24:09 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:24:09 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:09 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:09 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:09.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:10 compute-1 ceph-mon[80115]: pgmap v1347: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 851 B/s rd, 0 op/s
Feb 02 10:24:10 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:10 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:10 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:10.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:11 compute-1 ceph-mon[80115]: pgmap v1348: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 567 B/s rd, 0 op/s
Feb 02 10:24:11 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:11 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:11 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:11.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:12 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:12 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:12 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:12.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:12 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:13 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:13 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:24:13 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:13.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:24:13 compute-1 sudo[247890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 02 10:24:13 compute-1 sudo[247890]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:24:13 compute-1 sudo[247890]: pam_unix(sudo:session): session closed for user root
Feb 02 10:24:13 compute-1 nova_compute[226294]: 2026-02-02 10:24:13.667 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:13 compute-1 ceph-mon[80115]: pgmap v1349: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 567 B/s rd, 0 op/s
Feb 02 10:24:13 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:24:13 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' 
Feb 02 10:24:14 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:14 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:14 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:14.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:15 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:15 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:15 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:15.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:15 compute-1 ceph-mon[80115]: pgmap v1350: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 567 B/s rd, 0 op/s
Feb 02 10:24:16 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:16 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:16 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:16.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:17 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:17 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:17 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:17.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:17 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:17 compute-1 ceph-mon[80115]: pgmap v1351: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 567 B/s rd, 0 op/s
Feb 02 10:24:17 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:24:18 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:18 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:18 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:18.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:18 compute-1 nova_compute[226294]: 2026-02-02 10:24:18.669 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:24:18 compute-1 nova_compute[226294]: 2026-02-02 10:24:18.671 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:24:18 compute-1 nova_compute[226294]: 2026-02-02 10:24:18.671 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:24:18 compute-1 nova_compute[226294]: 2026-02-02 10:24:18.671 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:24:18 compute-1 nova_compute[226294]: 2026-02-02 10:24:18.707 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:18 compute-1 nova_compute[226294]: 2026-02-02 10:24:18.707 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:24:18 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/566599006' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:24:18 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3818587934' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:24:19 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:19 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:24:19 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:19.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:24:19 compute-1 ceph-mon[80115]: pgmap v1352: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 851 B/s rd, 0 op/s
Feb 02 10:24:20 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:20 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:20 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:20.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:21 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:21 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:21 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:21.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:21 compute-1 ceph-mon[80115]: pgmap v1353: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:21 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4206122882' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:24:22 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:22 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:22 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:22.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:22 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:22 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3006506951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:24:23 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:23 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:23 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:23.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:23 compute-1 sudo[247920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:24:23 compute-1 sudo[247920]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:24:23 compute-1 sudo[247920]: pam_unix(sudo:session): session closed for user root
Feb 02 10:24:23 compute-1 nova_compute[226294]: 2026-02-02 10:24:23.664 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:23 compute-1 nova_compute[226294]: 2026-02-02 10:24:23.709 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:23 compute-1 ceph-mon[80115]: pgmap v1354: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:24 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:24 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:24 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:24.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:24 compute-1 nova_compute[226294]: 2026-02-02 10:24:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:24 compute-1 nova_compute[226294]: 2026-02-02 10:24:24.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:25 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:25 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:24:25 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:25.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:24:25 compute-1 nova_compute[226294]: 2026-02-02 10:24:25.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:25 compute-1 ceph-mon[80115]: pgmap v1355: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:24:26 compute-1 podman[247946]: 2026-02-02 10:24:26.419065672 +0000 UTC m=+0.090645961 container health_status 1fb2696999ea15e9688144989093738543d3cef725f9986792d6dfd707aefbe2 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Feb 02 10:24:26 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:26 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:26 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:26.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:26 compute-1 nova_compute[226294]: 2026-02-02 10:24:26.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:26 compute-1 nova_compute[226294]: 2026-02-02 10:24:26.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 02 10:24:26 compute-1 nova_compute[226294]: 2026-02-02 10:24:26.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 02 10:24:26 compute-1 nova_compute[226294]: 2026-02-02 10:24:26.686 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 02 10:24:27 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:27 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:27 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:27.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:27 compute-1 nova_compute[226294]: 2026-02-02 10:24:27.682 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:27 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:27 compute-1 sshd-session[247973]: Accepted publickey for zuul from 192.168.122.10 port 45582 ssh2: ECDSA SHA256:RIWOugHsRom13QN8+H2eekzMj6VNcm6gUxie+zDStiQ
Feb 02 10:24:27 compute-1 systemd-logind[805]: New session 58 of user zuul.
Feb 02 10:24:27 compute-1 systemd[1]: Started Session 58 of User zuul.
Feb 02 10:24:27 compute-1 sshd-session[247973]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 02 10:24:27 compute-1 ceph-mon[80115]: pgmap v1356: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:27 compute-1 sudo[247977]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 02 10:24:27 compute-1 sudo[247977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 02 10:24:28 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:28 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:28 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:28.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:28 compute-1 nova_compute[226294]: 2026-02-02 10:24:28.671 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:28 compute-1 nova_compute[226294]: 2026-02-02 10:24:28.710 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:24:28 compute-1 nova_compute[226294]: 2026-02-02 10:24:28.712 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:28 compute-1 nova_compute[226294]: 2026-02-02 10:24:28.712 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 02 10:24:28 compute-1 nova_compute[226294]: 2026-02-02 10:24:28.712 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:24:28 compute-1 nova_compute[226294]: 2026-02-02 10:24:28.712 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 02 10:24:28 compute-1 nova_compute[226294]: 2026-02-02 10:24:28.713 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:29 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:29 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:29 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:29.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:29 compute-1 ceph-mon[80115]: pgmap v1357: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:24:30 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:30 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:30 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:30.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:30 compute-1 ceph-mon[80115]: from='client.28321 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:30 compute-1 ceph-mon[80115]: from='client.18723 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:30 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3747684622' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:24:31 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Feb 02 10:24:31 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1495132194' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:24:31 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:31 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:31 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:31.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:31 compute-1 nova_compute[226294]: 2026-02-02 10:24:31.648 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:31 compute-1 nova_compute[226294]: 2026-02-02 10:24:31.649 226298 DEBUG nova.compute.manager [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 02 10:24:31 compute-1 ceph-mon[80115]: from='client.28409 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:31 compute-1 ceph-mon[80115]: from='client.28327 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:31 compute-1 ceph-mon[80115]: from='client.18735 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:31 compute-1 ceph-mon[80115]: pgmap v1358: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:31 compute-1 ceph-mon[80115]: from='client.28421 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2120963382' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:24:31 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1495132194' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Feb 02 10:24:32 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:32 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:32 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:32.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:32 compute-1 nova_compute[226294]: 2026-02-02 10:24:32.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:32 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:32 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:24:33 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:33 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:24:33 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:33.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:24:33 compute-1 nova_compute[226294]: 2026-02-02 10:24:33.649 226298 DEBUG oslo_service.periodic_task [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 02 10:24:33 compute-1 nova_compute[226294]: 2026-02-02 10:24:33.676 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:24:33 compute-1 nova_compute[226294]: 2026-02-02 10:24:33.678 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:24:33 compute-1 nova_compute[226294]: 2026-02-02 10:24:33.678 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:24:33 compute-1 nova_compute[226294]: 2026-02-02 10:24:33.678 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 02 10:24:33 compute-1 nova_compute[226294]: 2026-02-02 10:24:33.679 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:24:33 compute-1 nova_compute[226294]: 2026-02-02 10:24:33.713 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:33 compute-1 ovs-vsctl[248329]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 02 10:24:34 compute-1 ceph-mon[80115]: pgmap v1359: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:34 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:24:34 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2759038895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.100 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.236 226298 WARNING nova.virt.libvirt.driver [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.237 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4712MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.237 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.238 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.412 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.412 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.522 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing inventories for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 02 10:24:34 compute-1 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 02 10:24:34 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:34 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:34 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:34.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:34 compute-1 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 02 10:24:34 compute-1 virtqemud[225988]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.616 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating ProviderTree inventory for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.617 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Updating inventory in ProviderTree for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.644 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing aggregate associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.669 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Refreshing trait associations for resource provider 8e32c057-ad28-4c19-8374-763e0c1c8622, traits: HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_CLMUL,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_AVX2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE42,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 02 10:24:34 compute-1 nova_compute[226294]: 2026-02-02 10:24:34.719 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 02 10:24:35 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2759038895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:24:35 compute-1 ceph-mon[80115]: pgmap v1360: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:24:35 compute-1 ceph-mon[80115]: from='client.28351 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:35 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: cache status {prefix=cache status} (starting...)
Feb 02 10:24:35 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:35 compute-1 lvm[248680]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 02 10:24:35 compute-1 lvm[248680]: VG ceph_vg0 finished
Feb 02 10:24:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 02 10:24:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1419073997' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:24:35 compute-1 nova_compute[226294]: 2026-02-02 10:24:35.217 226298 DEBUG oslo_concurrency.processutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 02 10:24:35 compute-1 nova_compute[226294]: 2026-02-02 10:24:35.225 226298 DEBUG nova.compute.provider_tree [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed in ProviderTree for provider: 8e32c057-ad28-4c19-8374-763e0c1c8622 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 02 10:24:35 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: client ls {prefix=client ls} (starting...)
Feb 02 10:24:35 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:35 compute-1 nova_compute[226294]: 2026-02-02 10:24:35.244 226298 DEBUG nova.scheduler.client.report [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Inventory has not changed for provider 8e32c057-ad28-4c19-8374-763e0c1c8622 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 02 10:24:35 compute-1 nova_compute[226294]: 2026-02-02 10:24:35.246 226298 DEBUG nova.compute.resource_tracker [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 02 10:24:35 compute-1 nova_compute[226294]: 2026-02-02 10:24:35.246 226298 DEBUG oslo_concurrency.lockutils [None req-934af0bf-49b3-4feb-a865-66e949909b4b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:24:35 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:35 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:35 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:35.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:35 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: damage ls {prefix=damage ls} (starting...)
Feb 02 10:24:35 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:35 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Feb 02 10:24:35 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/936668337' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:24:35 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump loads {prefix=dump loads} (starting...)
Feb 02 10:24:35 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1419073997' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mon[80115]: from='client.28363 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1312411424' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mon[80115]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mon[80115]: from='client.28375 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mon[80115]: from='client.28445 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/483216115' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/936668337' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mon[80115]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 02 10:24:36 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/663724622' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:36 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:36 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:36 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:36.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:36 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Feb 02 10:24:36 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1010756422' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: get subtrees {prefix=get subtrees} (starting...)
Feb 02 10:24:36 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.28469 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.28390 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3418756166' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.18783 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/663724622' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.28481 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2226803775' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1013650008' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2266820028' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: pgmap v1361: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.18804 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1010756422' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1850324762' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.28502 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/99299674' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: from='client.28438 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: ops {prefix=ops} (starting...)
Feb 02 10:24:37 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Feb 02 10:24:37 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2333568486' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb 02 10:24:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Feb 02 10:24:37 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3989197214' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Feb 02 10:24:37 compute-1 podman[248991]: 2026-02-02 10:24:37.41778746 +0000 UTC m=+0.087508497 container health_status 9cab238676dda9a572dafc047b624a8b78a8fbec063bf997856559897160605d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'db4758ee7523fe447444c4bd2b867b543b1eee4e3bbcf6676cd1b27bf6147d86-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-4cf0791ff1fe87a1cb78145a20c95c187a5d671b8bac4c8603b14a21d3c8f8c0-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 02 10:24:37 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:37 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:37 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:37.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:37 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: session ls {prefix=session ls} (starting...)
Feb 02 10:24:37 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen Can't run that command on an inactive MDS!
Feb 02 10:24:37 compute-1 ceph-mds[85402]: mds.cephfs.compute-1.khfsen asok_command: status {prefix=status} (starting...)
Feb 02 10:24:37 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 02 10:24:37 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3120345489' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.18825 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2333568486' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3622334637' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3989197214' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.28444 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3860226350' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.18843 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.28541 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3115935709' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2264099985' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3120345489' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4217101947' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1188552163' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: from='client.28568 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 02 10:24:38 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/703134171' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Feb 02 10:24:38 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1165547802' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:24:38 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:38 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:38 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:38.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:38 compute-1 nova_compute[226294]: 2026-02-02 10:24:38.715 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:38 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 02 10:24:38 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/73952731' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:24:38 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Feb 02 10:24:38 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/512297101' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.18870 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/477697622' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/703134171' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3486921131' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1165547802' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3726403783' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.18888 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: pgmap v1362: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4133551776' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.28516 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/73952731' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/512297101' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/644854955' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3225960705' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 02 10:24:39 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2629942829' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:24:39 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:39 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:39 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:39.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:39 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Feb 02 10:24:39 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/888108608' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb 02 10:24:39 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Feb 02 10:24:39 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3061187504' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1593635413' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3311814230' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.28634 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2629942829' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1959502888' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.28549 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/503603035' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/380177895' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/888108608' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3061187504' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2590607965' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Feb 02 10:24:40 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3424334972' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb 02 10:24:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 02 10:24:40 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1690255577' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:24:40 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:40 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:40 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:40.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:40 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 02 10:24:40 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1744096391' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:48.043392+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:49.043533+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:50.043721+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:51.043852+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:52.044034+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321e000 session 0x5616e34dbc20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321f800 session 0x5616e0d17860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:53.044226+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:54.044380+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:55.044510+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85106688 unmapped: 3014656 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:56.044695+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:57.044833+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:58.044940+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:51:59.045121+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:00.045400+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:01.045576+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962210 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:02.045736+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:03.045959+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85114880 unmapped: 3006464 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226ac00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.825603485s of 16.828807831s, submitted: 1
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:04.046201+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:05.046397+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:06.046606+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 962342 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:07.046827+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:08.047051+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:09.047248+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:10.047446+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:11.047691+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:12.047931+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963854 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:13.048276+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:14.048446+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:15.048634+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.089159966s of 12.096708298s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:16.048781+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:17.048983+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963263 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85131264 unmapped: 2990080 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:18.049216+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:19.049430+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:20.049666+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:21.049877+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:22.050068+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:23.050289+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:24.050423+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:25.050549+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:26.050713+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85139456 unmapped: 2981888 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:27.050894+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:28.051042+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:29.051219+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:30.051372+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:31.051522+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:32.051666+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:33.051860+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:34.052034+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:35.052217+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:36.052416+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:37.052576+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:38.052773+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:39.052941+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:40.053087+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:41.053225+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:42.053351+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:43.053562+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85147648 unmapped: 2973696 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:44.053793+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:45.053976+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:46.054238+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:47.054406+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85155840 unmapped: 2965504 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:48.054562+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:49.054692+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:50.054871+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85164032 unmapped: 2957312 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:51.055019+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:52.055257+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:53.074992+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:54.075201+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:55.075371+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:56.075542+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:57.075731+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:58.075918+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:52:59.076064+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:00.076311+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:01.076499+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:02.076649+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3283c00 session 0x5616e3750f00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2cafc00 session 0x5616e3750960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:03.076804+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3542000 session 0x5616e311a000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226ac00 session 0x5616e2797860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:04.076947+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:05.077108+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:06.077206+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:07.077343+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:08.077487+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:09.077632+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:10.077766+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:11.077924+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:12.078062+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 963131 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:13.078223+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3220400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 57.016975403s of 57.231136322s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85172224 unmapped: 2949120 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:14.078381+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e11f9800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85188608 unmapped: 2932736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:15.078552+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85188608 unmapped: 2932736 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:16.078734+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3281c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:17.078874+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964907 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:18.079037+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:19.079250+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321a800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85196800 unmapped: 2924544 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:20.079388+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:21.079537+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:22.079758+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966419 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:23.080194+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:24.080318+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:25.080453+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:26.080634+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.142086029s of 13.158586502s, submitted: 4
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 02 10:24:41 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1580771436' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:27.080814+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:28.080953+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85204992 unmapped: 2916352 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:29.083589+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:30.084446+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:31.084595+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85213184 unmapped: 2908160 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:32.084741+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:33.086063+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:34.086240+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9000 session 0x5616e278a000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3281c00 session 0x5616e312c000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:35.086363+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e226c400 session 0x5616e311fe00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e11f9800 session 0x5616e311a3c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:36.086503+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:37.086640+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:38.086786+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:39.087262+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:40.087411+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:41.087543+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:42.087684+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965564 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:43.087880+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:44.088070+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3281000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.639591217s of 18.651557922s, submitted: 3
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:45.088240+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:46.088483+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e1193000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:47.088677+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:48.088824+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:49.088977+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:50.089139+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85221376 unmapped: 2899968 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:51.089321+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e4000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:52.089461+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965828 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:53.089753+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:54.090388+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:55.090552+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.387768745s of 10.394624710s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:56.090719+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:57.090887+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964646 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:58.091041+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:53:59.091280+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:00.091486+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:01.091685+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:02.091819+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964514 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:03.091981+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:04.092203+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:05.092335+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:06.092471+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:07.092582+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:08.092781+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:09.092911+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:10.093048+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:11.093208+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:12.093344+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:13.093568+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:14.093718+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:15.093865+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:16.094004+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:17.094210+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:18.094401+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21e4000 session 0x5616e311af00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3281000 session 0x5616e312ef00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:19.094531+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85237760 unmapped: 2883584 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:20.094678+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:21.094846+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:22.095023+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:23.095219+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:24.095396+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:25.095605+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:26.095736+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:27.095900+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 964382 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:28.096060+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2caf800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.559337616s of 33.630935669s, submitted: 4
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:29.096202+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:30.096345+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:31.096488+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:32.096686+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3222000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:33.097500+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:34.097652+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:35.097848+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:36.098022+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:37.098181+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:38.098322+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:39.098518+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:40.098715+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:41.098899+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:42.099052+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966026 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:43.099204+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:44.099389+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.959052086s of 15.965865135s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:45.099531+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:46.099700+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:47.099839+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e0541c00 session 0x5616e0d6e1e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e11f9800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:48.099989+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:49.100260+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:50.100414+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:51.100576+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:52.100735+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:53.100962+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:54.101174+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:55.101375+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:56.101552+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:57.101741+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:58.101883+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:54:59.102041+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 2875392 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:00.102194+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:01.102347+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:02.102499+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:03.102667+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:04.102810+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:05.102939+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:06.103073+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:07.103264+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:08.103468+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:09.103622+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:10.103769+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:11.103913+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:12.104080+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:13.104213+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:14.104438+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:15.104601+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:16.104769+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:17.104942+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:18.105094+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:19.105244+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:20.105400+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:21.105648+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:22.105859+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:23.106096+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:24.106281+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:25.106485+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:26.106671+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:27.106887+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3222000 session 0x5616e311b0e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e1193000 session 0x5616e312c960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:28.107044+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:29.107212+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:30.107355+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:31.107497+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 2867200 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:32.107647+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:33.107807+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:34.107945+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:35.108114+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:36.108279+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:37.108544+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965894 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:38.108918+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 53.633445740s of 53.641864777s, submitted: 1
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:39.109057+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:40.109431+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:41.109919+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21adc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:42.110618+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967538 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:43.111269+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:44.111403+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 2859008 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:45.112589+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:46.113055+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:47.113438+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967538 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:48.113620+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:49.113781+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:50.113958+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.074842453s of 12.082664490s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:51.114134+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:52.114533+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966947 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:53.114949+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:54.115130+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:55.115336+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 2850816 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:56.115536+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:57.115680+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:58.115823+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:55:59.116051+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:00.116212+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:01.116364+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:02.116497+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e321a800 session 0x5616e311a780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3220400 session 0x5616e34ea5a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:03.116671+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:04.116811+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:05.116962+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:06.117203+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:07.117344+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:08.117468+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:09.117580+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:10.117706+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 2842624 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:11.117839+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 2834432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:12.118002+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 2834432 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 966815 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:13.118154+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327a000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.421203613s of 22.434373856s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:14.118299+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:15.118496+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:16.118643+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:17.118845+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968459 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:18.119074+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:19.119291+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:20.119424+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:21.119627+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 2826240 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:22.119774+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967868 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:23.119972+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:24.120106+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:25.120289+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:26.120487+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:27.120677+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 2818048 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967868 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:28.120804+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.030465126s of 15.042689323s, submitted: 3
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e327a000 session 0x5616e3750960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:29.121068+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:30.121247+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:31.121418+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:32.121593+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967736 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:33.121805+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:34.121961+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:35.122121+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:36.122331+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:37.122511+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 967736 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:38.122711+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:39.122878+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3282400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.958620071s of 10.963719368s, submitted: 1
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:40.123102+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:41.123282+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:42.123506+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970892 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:43.123993+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:44.125060+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:45.125513+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:46.125648+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:47.125784+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:48.125932+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970892 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:49.126063+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:50.126198+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:51.126373+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 2809856 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:52.126800+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:53.127011+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970301 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.377700806s of 13.986701965s, submitted: 4
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:54.127269+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:55.127455+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:56.127643+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:57.127866+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:58.128060+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:56:59.128228+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:00.128393+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:01.128554+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:02.128699+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:03.128933+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:04.129073+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:05.129190+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:06.129310+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:07.129473+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:08.129596+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:09.129706+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:10.129817+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:11.130050+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:12.130252+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:13.130427+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:14.130548+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:15.130717+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:16.130855+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:17.131015+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 2801664 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:18.131184+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:19.131325+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.28667 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.18969 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2696969086' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3424334972' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1690255577' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.28570 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/350123171' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2751691675' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4059925757' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: pgmap v1363: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.28706 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1744096391' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.28588 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3157816002' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2778894105' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2105064198' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:20.131485+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:21.131657+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:22.131800+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:23.132014+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:24.132163+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:25.132313+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:26.132452+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:27.132579+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:28.132730+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:29.132931+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:30.133114+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3282400 session 0x5616e311a3c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:31.133210+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:32.133359+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:33.133543+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:34.133742+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:35.133913+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:36.134179+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:37.134395+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:38.134642+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970169 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:39.134926+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:40.135132+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:41.135382+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3285400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 48.270744324s of 48.274307251s, submitted: 1
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:42.135573+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:43.135803+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970301 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:44.135939+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:45.136096+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:46.136263+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85327872 unmapped: 2793472 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:47.136387+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e1193c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:48.136506+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971813 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:49.136639+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:50.136726+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:51.136852+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:52.137046+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:53.137213+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:54.137377+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:55.137522+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:56.137697+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:57.137841+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:58.137973+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:57:59.138289+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.719709396s of 17.732963562s, submitted: 4
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:00.138465+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:01.138621+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:02.138783+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:03.139015+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:04.139398+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:05.139560+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:06.139745+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:07.139966+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:08.140222+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:09.140413+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:10.140571+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:11.140768+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:12.141044+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:13.141347+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:14.141500+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:15.141665+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:16.141812+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:17.141980+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:18.142097+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:19.142313+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85344256 unmapped: 2777088 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:20.142469+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:21.142615+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:22.142793+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21adc00 session 0x5616e311b860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e2caf800 session 0x5616e337f2c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:23.142977+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:24.143059+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:25.143202+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:26.143378+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:27.143525+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:28.143726+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970499 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:29.143908+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:30.144060+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:31.144588+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:32.144912+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:33.145256+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.156009674s of 34.160236359s, submitted: 1
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970631 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:34.145394+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:35.145554+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85352448 unmapped: 2768896 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:36.145696+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:37.145850+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:38.146048+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 8815 writes, 34K keys, 8815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 8815 writes, 1876 syncs, 4.70 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 746 writes, 1209 keys, 746 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
                                           Interval WAL: 746 writes, 348 syncs, 2.14 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3c9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x5616dea3d350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 8.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:39.146257+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:40.146419+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:41.146618+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:42.146762+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e1193c00 session 0x5616e311bc20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e3285400 session 0x5616e311a780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:43.146949+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:44.147038+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:45.147241+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:46.147385+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:47.147548+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:48.147670+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.711874962s of 15.720390320s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:49.147795+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:50.147968+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:51.148111+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:52.148321+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:53.148748+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85360640 unmapped: 2760704 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e10efc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:54.148964+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:55.149127+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:56.149319+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:57.149489+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:58.149659+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:58:59.149917+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d4800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:00.150102+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:01.150301+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:02.150479+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:03.150673+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972143 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:04.150809+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:05.150910+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.441757202s of 16.448879242s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:06.151079+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:07.151243+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 2752512 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:08.151415+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971552 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:09.151554+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:10.151758+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:11.151909+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:12.152099+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:13.152363+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:14.152522+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:15.152609+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:16.152756+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:17.152905+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:18.153071+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:19.153238+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:20.153431+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:21.153566+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:22.153741+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:23.153959+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:24.154119+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:25.154260+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:26.154449+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:27.154673+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:28.154834+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:29.154979+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:30.155114+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:31.155323+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:32.155496+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:33.155716+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:34.155881+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85377024 unmapped: 2744320 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:35.156056+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:36.156204+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:37.156397+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:38.156544+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:39.156756+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:40.156946+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:41.484530+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:42.484651+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85385216 unmapped: 2736128 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:43.484832+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:44.484990+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e34d4800 session 0x5616e3688780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e10efc00 session 0x5616e312ef00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:45.485179+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:46.485358+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:47.485547+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85393408 unmapped: 2727936 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:48.485732+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:49.485872+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:50.485994+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:51.486115+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:52.486276+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:53.486464+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971420 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:54.486654+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 2719744 heap: 88121344 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.622116089s of 49.629035950s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3280c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:55.486795+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85401600 unmapped: 3768320 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:56.486935+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 3612672 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:57.487076+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:58.487261+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 971552 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T09:59:59.487398+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:00.487583+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108a800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:01.487724+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:02.487870+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:03.488029+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973064 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:04.488218+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:05.488371+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:06.488538+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:07.488691+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread fragmentation_score=0.000029 took=0.000037s
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:08.488906+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973064 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:09.489026+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:10.489229+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:11.489385+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:12.489532+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85671936 unmapped: 3497984 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.980213165s of 18.280017853s, submitted: 343
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:13.489704+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:14.489814+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:15.489960+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:16.490117+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:17.490325+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:18.490486+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:19.490618+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:20.490749+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:21.490870+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:22.491020+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:23.491228+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:24.491394+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:25.491575+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:26.491716+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:27.491851+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:28.492035+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:29.492207+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:30.492399+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:31.492535+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:32.492664+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:33.492828+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:34.492979+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:35.493114+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:36.493262+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:37.493409+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:38.493537+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:39.493705+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:40.493844+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:41.493966+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:42.494070+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:43.494205+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:44.494359+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 ms_handle_reset con 0x5616e21ac400 session 0x5616e3688000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:45.494503+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:46.494631+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:47.494774+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:48.494870+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:49.495013+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:50.495186+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:51.495328+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:52.495497+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:53.495702+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 972932 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:54.495915+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85688320 unmapped: 3481600 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2273400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 41.939346313s of 41.943122864s, submitted: 1
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:55.496060+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:56.496213+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:57.496400+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:58.496542+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 974576 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:00:59.496698+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:00.496860+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:01.497070+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:02.497269+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:03.497443+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 974576 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:04.497615+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:05.497793+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:06.497964+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:07.498105+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:08.498294+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 973985 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:09.498455+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:10.498629+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:11.498773+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:12.498962+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 heartbeat osd_stat(store_statfs(0x4fc667000/0x0/0x4ffc00000, data 0xf24b6/0x1a5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.912214279s of 17.978521347s, submitted: 3
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:13.499241+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _renew_subs
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85696512 unmapped: 3473408 heap: 89169920 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 977619 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:14.499443+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327c800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 94093312 unmapped: 3465216 heap: 97558528 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:15.499621+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 141 ms_handle_reset con 0x5616e327c800 session 0x5616e1f7b0e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85827584 unmapped: 20127744 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb65f000/0x0/0x4ffc00000, data 0x10f66e2/0x11ab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e4c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:16.499853+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85868544 unmapped: 20086784 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:17.500017+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 142 ms_handle_reset con 0x5616e21e4c00 session 0x5616e37ce780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:18.500239+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1141871 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:19.500400+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:20.500615+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:21.500814+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85876736 unmapped: 20078592 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fae59000/0x0/0x4ffc00000, data 0x18fa8f2/0x19b1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:22.501009+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:23.501230+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:24.501381+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:25.501581+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:26.501729+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:27.501895+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:28.502058+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:29.502231+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:30.502345+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:31.502532+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:32.502756+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:33.502972+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:34.503138+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:35.503376+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:36.503580+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:37.503712+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:38.503892+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:39.504096+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:40.504244+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:41.504408+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85909504 unmapped: 20045824 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:42.504598+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e108a800 session 0x5616e3688d20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e3280c00 session 0x5616e36965a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:43.504825+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:44.504928+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:45.505099+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:46.505294+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:47.505437+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:48.505609+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147723 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:49.505748+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:50.505896+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:51.506052+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:52.506222+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3222400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 39.927474976s of 40.098861694s, submitted: 30
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:53.506430+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147855 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae57000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:54.506554+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:55.506701+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85917696 unmapped: 20037632 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:56.506846+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321c800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:57.506982+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 heartbeat osd_stat(store_statfs(0x4fae58000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:58.507121+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 85819392 unmapped: 20135936 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1148527 data_alloc: 218103808 data_used: 143360
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:01:59.507266+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327c400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226a400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 ms_handle_reset con 0x5616e226a400 session 0x5616e1eb4780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:00.507406+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:01.507558+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 145 heartbeat osd_stat(store_statfs(0x4fae58000/0x0/0x4ffc00000, data 0x18fc8c4/0x19b4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 100524032 unmapped: 5431296 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:02.507715+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2cac000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 146 ms_handle_reset con 0x5616e2cac000 session 0x5616e34770e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102490112 unmapped: 3465216 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:03.507916+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 3440640 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212945 data_alloc: 234881024 data_used: 13774848
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:04.508530+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:05.508874+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.954066277s of 12.084420204s, submitted: 24
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:06.509036+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102555648 unmapped: 3399680 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fad62000/0x0/0x4ffc00000, data 0x19eeaf0/0x1aa8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:07.509421+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102416384 unmapped: 3538944 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:08.509602+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 102416384 unmapped: 3538944 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1213508 data_alloc: 234881024 data_used: 13774848
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:09.509948+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321dc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103014400 unmapped: 2940928 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:10.510165+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:11.510461+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:12.510919+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:13.511237+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218848 data_alloc: 234881024 data_used: 14458880
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327c400 session 0x5616e34eb2c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3222400 session 0x5616e3378780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:14.511372+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:15.511630+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:16.511998+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:17.512193+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:18.512463+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1218848 data_alloc: 234881024 data_used: 14458880
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:19.512678+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 103120896 unmapped: 2834432 heap: 105955328 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:20.512883+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.370294571s of 15.401467323s, submitted: 11
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 2965504 heap: 109101056 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4fad60000/0x0/0x4ffc00000, data 0x19f0ac2/0x1aab000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x33ef9c6), peers [1,2] op hist [0,0,1])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:21.513082+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106307584 unmapped: 2793472 heap: 109101056 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:22.513254+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:23.513439+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1281178 data_alloc: 234881024 data_used: 14733312
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:24.513566+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:25.513741+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:26.513969+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f944e000/0x0/0x4ffc00000, data 0x215bac2/0x2216000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 107315200 unmapped: 2834432 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:27.514121+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:28.514420+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276638 data_alloc: 234881024 data_used: 14733312
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:29.514626+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106733568 unmapped: 3416064 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:30.514806+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2619000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.930480957s of 10.239095688s, submitted: 95
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:31.514964+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9435000/0x0/0x4ffc00000, data 0x217cac2/0x2237000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:32.515206+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:33.515393+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106741760 unmapped: 3407872 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1277223 data_alloc: 234881024 data_used: 14733312
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:34.515544+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9435000/0x0/0x4ffc00000, data 0x217cac2/0x2237000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:35.515728+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:36.515917+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:37.516059+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:38.516196+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1277072 data_alloc: 234881024 data_used: 14733312
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:39.516347+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 3317760 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:40.516503+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f942c000/0x0/0x4ffc00000, data 0x2185ac2/0x2240000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321e000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e000 session 0x5616e1f04960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327d800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d800 session 0x5616e1f054a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0ff1c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0ff1c00 session 0x5616e1f05e00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106577920 unmapped: 3571712 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:41.516655+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 106577920 unmapped: 3571712 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:42.516809+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283800 session 0x5616e1f04780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 1802240 heap: 110149632 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.226655960s of 12.266713142s, submitted: 7
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:43.516986+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321f000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321f000 session 0x5616e337e000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0ff1c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108953600 unmapped: 3293184 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0ff1c00 session 0x5616e337f860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293665 data_alloc: 234881024 data_used: 15781888
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:44.517122+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321e000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e000 session 0x5616e0d161e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:45.517300+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:46.517456+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:47.517608+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:48.517785+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:49.517917+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1293681 data_alloc: 234881024 data_used: 15781888
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108994560 unmapped: 3252224 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:50.518101+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 3219456 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:51.518275+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109027328 unmapped: 3219456 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:52.518459+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99000 session 0x5616e1f7b2c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109043712 unmapped: 3203072 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:53.518746+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e10eec00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.034008980s of 10.189837456s, submitted: 37
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109182976 unmapped: 3063808 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:54.518890+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1294918 data_alloc: 234881024 data_used: 15740928
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:55.519002+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:56.519133+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:57.519260+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:58.519368+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:02:59.546943+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1295070 data_alloc: 234881024 data_used: 15749120
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:00.547096+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92cc000/0x0/0x4ffc00000, data 0x22e4b24/0x23a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:01.547243+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109232128 unmapped: 3014656 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:02.547390+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321c800 session 0x5616e3750d20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2273400 session 0x5616e36892c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108748800 unmapped: 3497984 heap: 112246784 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:03.547558+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f92c9000/0x0/0x4ffc00000, data 0x22e7b24/0x23a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.504937172s of 10.520147324s, submitted: 3
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108732416 unmapped: 5611520 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:04.547728+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1341628 data_alloc: 234881024 data_used: 15806464
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109731840 unmapped: 4612096 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:05.547860+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:06.548047+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29abb24/0x2a67000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:07.548193+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:08.548375+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99800 session 0x5616e27985a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542c00 session 0x5616e2797860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:09.548580+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352434 data_alloc: 234881024 data_used: 15826944
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:10.549122+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29aeb24/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:11.549257+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29aeb24/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:12.549376+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 4251648 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226b000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:13.549506+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e10eec00 session 0x5616e311e1e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542000 session 0x5616e2620000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2617400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.650221825s of 10.002939224s, submitted: 115
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617400 session 0x5616e337e3c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:14.549655+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286162 data_alloc: 234881024 data_used: 15585280
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:15.549794+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9422000/0x0/0x4ffc00000, data 0x218eac2/0x2249000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:16.549957+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f941f000/0x0/0x4ffc00000, data 0x2191ac2/0x224c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:17.550092+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:18.550239+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f941f000/0x0/0x4ffc00000, data 0x2191ac2/0x224c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109658112 unmapped: 4685824 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:19.550397+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286522 data_alloc: 234881024 data_used: 15585280
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3220800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 4669440 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:20.550563+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109674496 unmapped: 4669440 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:21.550739+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321dc00 session 0x5616e3475a40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:22.550882+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e3751860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:23.551077+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:24.551203+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212351 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:25.551294+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.967247963s of 12.068682671s, submitted: 26
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d4000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:26.551466+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:27.551627+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:28.551737+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:29.551863+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215375 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:30.551987+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108904448 unmapped: 5439488 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:31.552136+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 5431296 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:32.552292+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108912640 unmapped: 5431296 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:33.552514+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:34.552669+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215111 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:35.552847+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:36.553021+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:37.553168+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:38.553308+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:39.554253+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1215111 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:40.554450+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108920832 unmapped: 5423104 heap: 114343936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:41.554806+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327e000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.472117424s of 15.489373207s, submitted: 5
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327e000 session 0x5616e2797c20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:42.554977+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:43.555219+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f922c000/0x0/0x4ffc00000, data 0x2385ac2/0x2440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:44.555370+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290065 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:45.555556+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:46.555720+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:47.555932+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f922c000/0x0/0x4ffc00000, data 0x2385ac2/0x2440000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:48.556104+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:49.556233+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290065 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e2d70d20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220800 session 0x5616e312f4a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 108961792 unmapped: 22183936 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:50.556425+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e337ef00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 109068288 unmapped: 22077440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:51.556619+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321dc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [1])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114139136 unmapped: 17006592 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:52.556829+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118480896 unmapped: 12664832 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:53.557067+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:54.557229+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366317 data_alloc: 234881024 data_used: 25640960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:55.557428+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:56.557569+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:57.557739+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:58.557941+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:03:59.558137+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366317 data_alloc: 234881024 data_used: 25640960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:00.558381+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321a000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.837284088s of 18.897920609s, submitted: 9
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 118497280 unmapped: 12648448 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:01.558558+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9208000/0x0/0x4ffc00000, data 0x23a9ac2/0x2464000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:02.558753+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119037952 unmapped: 12107776 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:03.558944+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119062528 unmapped: 12083200 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f901d000/0x0/0x4ffc00000, data 0x2594ac2/0x264f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:04.559235+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1397161 data_alloc: 234881024 data_used: 26198016
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:05.559421+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:06.559629+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d6000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:07.559805+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:08.559964+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:09.560137+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 11837440 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1396570 data_alloc: 234881024 data_used: 26198016
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:10.560351+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:11.560581+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:12.560794+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.991930008s of 12.083539009s, submitted: 22
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:13.561050+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:14.561289+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394635 data_alloc: 234881024 data_used: 26198016
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:15.561556+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:16.561697+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:17.561905+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:18.562069+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:19.562262+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:20.562469+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:21.562613+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:22.562744+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:23.562948+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:24.563057+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:25.563241+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:26.563364+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:27.563561+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9015000/0x0/0x4ffc00000, data 0x259cac2/0x2657000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:28.563715+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:29.563866+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120356864 unmapped: 10788864 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1394503 data_alloc: 234881024 data_used: 26198016
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321dc00 session 0x5616e3475e00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e0ffeb40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3222c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.675941467s of 17.711282730s, submitted: 2
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:30.563991+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3222c00 session 0x5616e3689a40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:31.564137+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:32.564374+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:33.564602+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:34.564768+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:35.564977+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:36.565201+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:37.565545+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:38.565698+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:39.565838+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:40.566022+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:41.566223+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:42.566346+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:43.566508+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:44.566678+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:45.566956+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:46.567106+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:47.567317+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:48.567496+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:49.567734+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:50.567863+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:51.568008+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:52.568219+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:53.568441+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:54.568578+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224201 data_alloc: 234881024 data_used: 14626816
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:55.568718+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:56.568877+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:57.568998+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114180096 unmapped: 16965632 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2273400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2273400 session 0x5616e34eaf00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e34ead20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e278ad20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:58.569123+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2271c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114892800 unmapped: 16252928 heap: 131145728 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e3408000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3284800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284800 session 0x5616e36914a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.119201660s of 29.155227661s, submitted: 11
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:04:59.569234+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99c00 session 0x5616e311b0e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 20881408 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e10e7c20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e10e63c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2271c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e0572780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3284800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284800 session 0x5616e1f7a1e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273675 data_alloc: 234881024 data_used: 14823424
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ad2/0x19be000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:00.569408+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:01.569516+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:02.569615+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113958912 unmapped: 20865024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:03.569771+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9651000/0x0/0x4ffc00000, data 0x1f5fad2/0x201b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113844224 unmapped: 20979712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:04.569926+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113844224 unmapped: 20979712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4000 session 0x5616e0d17860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226b000 session 0x5616e37cef00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1273675 data_alloc: 234881024 data_used: 14823424
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e34ea780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:05.570100+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2271c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:06.570238+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:07.570407+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113713152 unmapped: 21110784 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:08.570551+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:09.570731+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321277 data_alloc: 234881024 data_used: 16965632
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:10.570905+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:11.571059+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:12.571205+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:13.571420+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:14.571589+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111927296 unmapped: 22896640 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321277 data_alloc: 234881024 data_used: 16965632
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3282800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.900746346s of 15.997513771s, submitted: 18
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:15.571756+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:16.572182+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:17.572428+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9650000/0x0/0x4ffc00000, data 0x1f5faf5/0x201c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 23248896 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:18.572782+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90fc000/0x0/0x4ffc00000, data 0x24b3af5/0x2570000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113655808 unmapped: 21168128 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:19.573202+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90fc000/0x0/0x4ffc00000, data 0x24b3af5/0x2570000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1367081 data_alloc: 234881024 data_used: 17113088
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:20.573555+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:21.573935+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:22.574222+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:23.574605+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 113672192 unmapped: 21151744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:24.574802+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366154 data_alloc: 234881024 data_used: 17113088
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:25.575022+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:26.575306+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:27.575506+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:28.575733+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:29.575999+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366154 data_alloc: 234881024 data_used: 17113088
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:30.576204+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:31.576439+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:32.576669+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.825149536s of 17.985601425s, submitted: 37
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:33.576984+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:34.577132+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f90f6000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366022 data_alloc: 234881024 data_used: 17113088
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:35.577353+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:36.577550+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 112992256 unmapped: 21831680 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321bc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321bc00 session 0x5616e3696d20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327f800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f800 session 0x5616df7d8780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0e07000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e311f860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0e07000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616dfb71c20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108bc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:37.577726+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108bc00 session 0x5616e0d161e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226b000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226b000 session 0x5616e1f054a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321bc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321bc00 session 0x5616e311a960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327f800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f800 session 0x5616e37ced20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0e07000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e0d6ed20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114393088 unmapped: 20430848 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c38000/0x0/0x4ffc00000, data 0x2976b57/0x2a34000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:38.577911+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:39.578103+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413608 data_alloc: 234881024 data_used: 17113088
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:40.578277+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 20398080 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:41.578467+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114434048 unmapped: 20389888 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:42.578657+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114434048 unmapped: 20389888 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:43.578939+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327d400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.185551643s of 10.350721359s, submitted: 46
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e34db4a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 20652032 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:44.579120+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3280000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114171904 unmapped: 20652032 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1417109 data_alloc: 234881024 data_used: 17113088
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:45.579278+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 114868224 unmapped: 19955712 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:46.579488+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:47.579660+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:48.579850+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:49.580015+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116793344 unmapped: 18030592 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449333 data_alloc: 234881024 data_used: 20922368
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:50.580218+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:51.580369+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:52.580554+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:53.580819+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:54.580975+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116801536 unmapped: 18022400 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449333 data_alloc: 234881024 data_used: 20922368
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8c13000/0x0/0x4ffc00000, data 0x299ab7a/0x2a59000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:55.581178+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 18014208 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.333662033s of 12.362901688s, submitted: 6
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:56.581363+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121585664 unmapped: 13238272 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:57.581515+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 12673024 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:58.581673+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123871232 unmapped: 10952704 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:05:59.581853+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f833b000/0x0/0x4ffc00000, data 0x3272b7a/0x3331000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123904000 unmapped: 10919936 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1524459 data_alloc: 234881024 data_used: 21643264
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:00.582050+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123904000 unmapped: 10919936 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:01.582200+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123912192 unmapped: 10911744 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3280000 session 0x5616e2cd8b40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99000 session 0x5616e311e960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:02.582338+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2caf800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123445248 unmapped: 11378688 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2caf800 session 0x5616e2cd8780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:03.582536+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:04.582648+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370129 data_alloc: 234881024 data_used: 15147008
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:05.582846+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 15253504 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:06.583055+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:07.583236+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:08.583343+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:09.583525+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1370129 data_alloc: 234881024 data_used: 15147008
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:10.583674+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 15245312 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:11.583851+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119586816 unmapped: 15237120 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e34061e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.954436302s of 15.854330063s, submitted: 169
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e311a5a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226d000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:12.583972+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226d000 session 0x5616e1f052c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8f62000/0x0/0x4ffc00000, data 0x24b9af5/0x2576000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:13.584194+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:14.584365+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:15.584559+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:16.584721+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:17.584889+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:18.585049+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:19.585235+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:20.585393+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:21.585582+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:22.585741+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:23.585935+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:24.586084+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:25.586252+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:26.586406+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9cae000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:27.586598+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:28.586777+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:29.586940+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1239177 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:30.587104+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:31.587303+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117055488 unmapped: 17768448 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321c400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.149909973s of 20.260728836s, submitted: 33
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321c400 session 0x5616e04dd680
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:32.587479+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2caec00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2caec00 session 0x5616e2797860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c000 session 0x5616e263c960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226d000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226d000 session 0x5616e359af00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2271c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2271c00 session 0x5616e04dc1e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9caf000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:33.587684+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:34.587908+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250106 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:35.588098+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:36.588221+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:37.588378+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:38.588513+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321e800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117145600 unmapped: 17678336 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:39.588637+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250238 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:40.588771+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:41.588888+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:42.589038+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:43.589208+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:44.589374+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1250238 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:45.589558+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:46.589737+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:47.589872+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:48.590004+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b9b000/0x0/0x4ffc00000, data 0x1a16ac2/0x1ad1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:49.590136+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.556289673s of 17.624111176s, submitted: 16
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116973568 unmapped: 17850368 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255618 data_alloc: 234881024 data_used: 10326016
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:50.590342+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116989952 unmapped: 17833984 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b34000/0x0/0x4ffc00000, data 0x1a7cac2/0x1b37000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:51.590478+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:52.590616+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:53.590784+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:54.590900+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:55.591024+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:56.591177+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:57.591326+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:58.591483+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:06:59.591615+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:00.591831+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:01.591966+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:02.592123+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:03.592394+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:04.592582+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:05.592737+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:06.592896+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:07.593029+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d6000 session 0x5616e0f6e000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321a000 session 0x5616e337e5a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:08.593231+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116998144 unmapped: 17825792 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:09.593363+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:10.593494+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283400 session 0x5616e2d70000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282800 session 0x5616e34741e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:11.593636+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:12.593769+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:13.593973+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:14.594138+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258448 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:15.594329+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:16.594458+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:17.594582+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:18.594813+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.186046600s of 29.217700958s, submitted: 7
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:19.594940+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117006336 unmapped: 17817600 heap: 134823936 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:20.595075+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258580 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9b31000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283000 session 0x5616e313e780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:21.595228+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327cc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:22.595342+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115892224 unmapped: 22609920 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:23.595481+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f5000/0x0/0x4ffc00000, data 0x20bcac2/0x2177000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:24.595693+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2cac000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:25.597376+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310128 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:26.599742+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 22593536 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:27.601653+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0540c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 23166976 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:28.605939+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321cc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321cc00 session 0x5616e3476780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f5000/0x0/0x4ffc00000, data 0x20bcac2/0x2177000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3543400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3543400 session 0x5616e33d3e00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 23166976 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:29.606093+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321cc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321cc00 session 0x5616e33d3a40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3282800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.488653183s of 10.665133476s, submitted: 14
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282800 session 0x5616e33d23c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 23158784 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:30.609037+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315916 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 23158784 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:31.609220+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 23068672 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:32.609415+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:33.609849+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:34.610072+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:35.610196+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357250 data_alloc: 234881024 data_used: 16588800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:36.610366+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:37.610529+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:38.610776+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:39.611098+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:40.611334+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357250 data_alloc: 234881024 data_used: 16588800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117096448 unmapped: 21405696 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:41.611467+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f3000/0x0/0x4ffc00000, data 0x20bcaf5/0x2179000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.383611679s of 12.435736656s, submitted: 16
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 16957440 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:42.611633+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122388480 unmapped: 16113664 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:43.611868+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:44.612088+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:45.612322+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1437520 data_alloc: 234881024 data_used: 17522688
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8bae000/0x0/0x4ffc00000, data 0x2a00af5/0x2abd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:46.612479+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8bae000/0x0/0x4ffc00000, data 0x2a00af5/0x2abd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122961920 unmapped: 15540224 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:47.612645+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122970112 unmapped: 15532032 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:48.612843+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:49.613008+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:50.613229+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435392 data_alloc: 234881024 data_used: 17522688
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:51.613400+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b90000/0x0/0x4ffc00000, data 0x2a1faf5/0x2adc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:52.613576+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:53.613916+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122224640 unmapped: 16277504 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:54.614297+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.516505241s of 13.009800911s, submitted: 89
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122429440 unmapped: 16072704 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:55.614581+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435504 data_alloc: 234881024 data_used: 17522688
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122437632 unmapped: 16064512 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:56.614743+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122437632 unmapped: 16064512 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:57.615096+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:58.615227+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:07:59.615365+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:00.615522+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435504 data_alloc: 234881024 data_used: 17522688
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:01.615663+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:02.615828+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b86000/0x0/0x4ffc00000, data 0x2a29af5/0x2ae6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:03.616036+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:04.616263+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:05.616403+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:06.616556+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:07.616675+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:08.616852+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122445824 unmapped: 16056320 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:09.617055+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:10.617192+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:11.617317+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:12.617447+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:13.617650+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:14.617787+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.440578461s of 19.455564499s, submitted: 4
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:15.617925+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:16.618076+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:17.618227+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:18.618411+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:19.618555+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:20.618657+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122454016 unmapped: 16048128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1435592 data_alloc: 234881024 data_used: 17522688
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8b83000/0x0/0x4ffc00000, data 0x2a2caf5/0x2ae9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283400 session 0x5616e312e960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283000 session 0x5616e37ceb40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0e07000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:21.618818+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0e07000 session 0x5616e34065a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:22.619008+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:23.619220+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:24.619314+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:25.619387+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268225 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:26.619495+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:27.619632+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:28.619813+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:29.620984+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:30.622012+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1268225 data_alloc: 234881024 data_used: 10338304
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9721000/0x0/0x4ffc00000, data 0x1a80ac2/0x1b3b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:31.624209+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117563392 unmapped: 20938752 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.921215057s of 17.030107498s, submitted: 36
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e800 session 0x5616e3406d20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3542400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542400 session 0x5616e34772c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:32.624393+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:33.624616+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:34.624785+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:35.625711+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:36.625973+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:37.626197+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:38.626585+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 11K writes, 41K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 11K writes, 2812 syncs, 3.92 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2196 writes, 6935 keys, 2196 commit groups, 1.0 writes per commit group, ingest: 6.82 MB, 0.01 MB/s
                                           Interval WAL: 2196 writes, 936 syncs, 2.35 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:39.626953+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:40.627290+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:41.627580+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:42.627791+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:43.627944+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:44.628233+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:45.628472+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:46.628692+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:47.628877+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:48.629384+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:49.629922+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:50.630235+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117579776 unmapped: 20922368 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252711 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2617c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.137660980s of 19.212322235s, submitted: 21
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617c00 session 0x5616e312fe00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:51.630430+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:52.630668+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:53.630929+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:54.631109+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:55.631275+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9516000/0x0/0x4ffc00000, data 0x1c8bac2/0x1d46000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282483 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327d400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e34734a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:56.631455+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e10efc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e10efc00 session 0x5616e34ea3c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117604352 unmapped: 20897792 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9516000/0x0/0x4ffc00000, data 0x1c8bac2/0x1d46000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2617c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2617c00 session 0x5616e2d70780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321e800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:57.631614+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321e800 session 0x5616e1f05c20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116129792 unmapped: 22372352 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327d400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:58.631853+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2255c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116170752 unmapped: 22331392 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:08:59.632023+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 21725184 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:00.632218+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2255c00 session 0x5616e2d71680
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327d400 session 0x5616e3476f00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 21725184 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312446 data_alloc: 234881024 data_used: 14000128
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3285000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.036809921s of 10.113059044s, submitted: 15
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3285000 session 0x5616e36881e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f94f1000/0x0/0x4ffc00000, data 0x1cafad2/0x1d6b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:01.632367+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:02.632519+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:03.632693+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:04.632840+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:05.633011+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258361 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:06.633193+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:07.633366+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:08.634205+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2619000 session 0x5616e33d2780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e3494000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:09.634340+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116310016 unmapped: 22192128 heap: 138502144 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283c00 session 0x5616e3691c20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2270c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e34ea1e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2a99800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2a99800 session 0x5616e337f0e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e0fffc20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2270c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e05730e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2619000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2619000 session 0x5616e2cd83c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3283c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:10.634582+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3283c00 session 0x5616e05a2780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e5400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5400 session 0x5616dfb74b40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e311b680
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1346877 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8d84000/0x0/0x4ffc00000, data 0x241cad2/0x24d8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:11.634767+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:12.634919+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2270c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e3476000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:13.635061+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3282c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3282c00 session 0x5616e34761e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321ec00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321ec00 session 0x5616e359ab40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d5800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.106111526s of 13.270271301s, submitted: 38
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5800 session 0x5616e33d2f00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:14.635246+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116105216 unmapped: 29745152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2270c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:15.635436+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116113408 unmapped: 29736960 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348691 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:16.635624+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8d83000/0x0/0x4ffc00000, data 0x241cae2/0x24d9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:17.635777+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:18.635977+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21ac400 session 0x5616e3404780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2270c00 session 0x5616e04dc960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2cae400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:19.636118+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2cae400 session 0x5616e311a000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:20.636262+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:21.636402+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:22.636573+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:23.636728+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:24.636868+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:25.637018+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:26.637238+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:27.637409+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:28.637610+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:29.637802+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:30.637965+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266122 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:31.638169+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:32.638401+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:33.638647+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:34.638853+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116850688 unmapped: 28999680 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.711370468s of 20.742525101s, submitted: 12
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:35.639029+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265990 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:36.639220+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:37.639414+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:38.639610+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:39.640068+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:40.640267+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265990 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:41.640451+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:42.640641+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:43.640837+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 117899264 unmapped: 27951104 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327fc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327fc00 session 0x5616e312d680
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:44.641022+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:45.641204+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298342 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0541400 session 0x5616dfb705a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0541400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:46.641385+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0541000 session 0x5616dfb703c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21ac400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116711424 unmapped: 29138944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f961c000/0x0/0x4ffc00000, data 0x1b85ac2/0x1c40000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:47.641547+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e11f9800 session 0x5616e311b4a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0541000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116719616 unmapped: 29130752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:48.641739+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116719616 unmapped: 29130752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:49.641917+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d4800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.859712601s of 14.898717880s, submitted: 17
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4800 session 0x5616e04dde00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3220400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3284c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:50.642077+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301202 data_alloc: 234881024 data_used: 10452992
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:51.642202+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:52.642332+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:53.642510+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:54.642638+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 28893184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:55.642785+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116490240 unmapped: 29360128 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308194 data_alloc: 234881024 data_used: 11509760
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:56.643002+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116695040 unmapped: 29155328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:57.643209+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116858880 unmapped: 28991488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:58.643364+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116858880 unmapped: 28991488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:09:59.643553+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f95f8000/0x0/0x4ffc00000, data 0x1ba9ac2/0x1c64000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116867072 unmapped: 28983296 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:00.643710+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 116817920 unmapped: 29032448 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.749152184s of 11.151672363s, submitted: 349
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311098 data_alloc: 234881024 data_used: 11579392
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:01.643884+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119087104 unmapped: 26763264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:02.644067+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:03.644266+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:04.644427+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8920000/0x0/0x4ffc00000, data 0x286aac2/0x2925000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:05.644585+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1422902 data_alloc: 234881024 data_used: 11780096
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:06.644756+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 25812992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:07.644952+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8920000/0x0/0x4ffc00000, data 0x286aac2/0x2925000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:08.645189+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:09.645374+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:10.645544+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1414566 data_alloc: 234881024 data_used: 11788288
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:11.645695+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:12.645843+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 26484736 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.005815506s of 12.341868401s, submitted: 120
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:13.646092+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8903000/0x0/0x4ffc00000, data 0x289eac2/0x2959000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 26370048 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:14.646275+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220400 session 0x5616e04dd680
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284c00 session 0x5616e3477680
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 26370048 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108b400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:15.646481+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108b400 session 0x5616dfb634a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:16.646673+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:17.646804+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:18.646970+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:19.647185+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:20.647398+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:21.647580+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:22.647733+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:23.647970+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:24.648156+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:25.648308+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:26.648493+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119390208 unmapped: 26460160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:27.648685+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 26451968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:28.648853+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 26451968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:29.648995+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f93e4000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:30.649216+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1279966 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:31.649387+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:32.649548+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 26443776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:33.649904+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2618800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618800 session 0x5616e312d0e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108b400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e108b400 session 0x5616e312fc20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3220400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3220400 session 0x5616e34725a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3284c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3284c00 session 0x5616e312c780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119422976 unmapped: 26427392 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d4800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.763683319s of 20.829357147s, submitted: 19
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d4800 session 0x5616e3477c20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:34.650059+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e11f9c00 session 0x5616e05732c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e108b400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119709696 unmapped: 26140672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:35.650385+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1324070 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:36.650518+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:37.679962+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:38.680196+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 26132480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9338000/0x0/0x4ffc00000, data 0x1e69ac2/0x1f24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d7000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d7000 session 0x5616e04dc960
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:39.680352+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d5000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327a400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 120020992 unmapped: 25829376 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:40.680569+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 26042368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1330608 data_alloc: 234881024 data_used: 11010048
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:41.680715+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:42.680886+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:43.681072+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:44.681219+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:45.681379+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1356296 data_alloc: 234881024 data_used: 14831616
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:46.681577+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:47.681732+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9314000/0x0/0x4ffc00000, data 0x1e8dac2/0x1f48000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:48.681859+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 24371200 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:49.682006+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 24354816 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:50.682170+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.424659729s of 16.518987656s, submitted: 14
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3281400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3281400 session 0x5616e0d6ed20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123994112 unmapped: 21856256 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453094 data_alloc: 234881024 data_used: 15106048
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:51.682319+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124026880 unmapped: 21823488 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:52.682479+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87f4000/0x0/0x4ffc00000, data 0x29abac2/0x2a66000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:53.682705+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:54.682884+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:55.683075+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1459086 data_alloc: 234881024 data_used: 15007744
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:56.683249+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x29b3ac2/0x2a6e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0540c00 session 0x5616e0cdb0e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327cc00 session 0x5616e278a780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:57.683446+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123879424 unmapped: 21970944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:58.683617+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 22183936 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2616400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2616400 session 0x5616e36910e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:10:59.683855+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e0540c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327cc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123813888 unmapped: 22036480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:00.684020+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466042 data_alloc: 234881024 data_used: 16191488
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87c7000/0x0/0x4ffc00000, data 0x29daac2/0x2a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:01.684270+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:02.684474+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:03.684682+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:04.684919+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:05.685100+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466042 data_alloc: 234881024 data_used: 16191488
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:06.685276+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124370944 unmapped: 21479424 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f87c7000/0x0/0x4ffc00000, data 0x29daac2/0x2a95000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:07.685459+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:08.685635+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:09.685769+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124403712 unmapped: 21446656 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:10.685918+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.643545151s of 19.996114731s, submitted: 90
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124805120 unmapped: 21045248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1548442 data_alloc: 234881024 data_used: 16363520
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:11.686083+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124837888 unmapped: 21012480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:12.686241+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7d82000/0x0/0x4ffc00000, data 0x3417ac2/0x34d2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124469248 unmapped: 21381120 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:13.686448+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2cac000 session 0x5616e1f04b40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e3542800 session 0x5616e1ec81e0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:14.686712+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:15.686923+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555186 data_alloc: 234881024 data_used: 16371712
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:16.687059+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123109376 unmapped: 22740992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:17.687249+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123117568 unmapped: 22732800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7d01000/0x0/0x4ffc00000, data 0x34a0ac2/0x355b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:18.687379+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:19.687537+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:20.687676+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555554 data_alloc: 234881024 data_used: 16371712
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.561103821s of 10.734168053s, submitted: 73
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:21.687830+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:22.687977+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123248640 unmapped: 22601728 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:23.688200+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:24.688321+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e3222c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:25.688441+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555386 data_alloc: 234881024 data_used: 16371712
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:26.688583+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f7ce0000/0x0/0x4ffc00000, data 0x34c1ac2/0x357c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:27.688701+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:28.688846+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:29.688965+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:30.689084+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123256832 unmapped: 22593536 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1555642 data_alloc: 234881024 data_used: 16371712
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:31.689267+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d6800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.766269684s of 10.206396103s, submitted: 5
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e0540c00 session 0x5616e312c3c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327cc00 session 0x5616e34734a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e321b000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:32.689391+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e321b000 session 0x5616e0cdb680
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:33.689594+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:34.689746+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:35.689898+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a68000/0x0/0x4ffc00000, data 0x2739ac2/0x27f4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5000 session 0x5616e04dde00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327a400 session 0x5616e1f04780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123273216 unmapped: 22577152 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1439208 data_alloc: 234881024 data_used: 15007744
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:36.690011+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2618c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618c00 session 0x5616e311b860
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:37.690245+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:38.690414+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:39.690551+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:40.690661+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:41.690853+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:42.691134+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:43.691404+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:44.691542+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:45.691689+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:46.691844+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:47.691988+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: mgrc ms_handle_reset ms_handle_reset con 0x5616dff2cc00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1282799344
Feb 02 10:24:41 compute-1 ceph-osd[77691]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1282799344,v1:192.168.122.100:6801/1282799344]
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: get_auth_request con 0x5616e2616400 auth_method 0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: mgrc handle_mgr_configure stats_period=5
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:48.692200+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:49.692326+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:50.692461+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:51.692661+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:52.692814+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:53.693083+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:54.693248+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:55.693381+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:56.693564+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:57.693726+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:58.693939+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:11:59.694186+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:00.694370+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:01.694527+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299793 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:02.694668+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:03.694855+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:04.695023+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:05.695228+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d7c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d7c00 session 0x5616e312cd20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226a400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226a400 session 0x5616e1f04d20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e2618c00
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e2618c00 session 0x5616dfb70b40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327a400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327a400 session 0x5616e1f7a780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e34d5000
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.408138275s of 34.488780975s, submitted: 26
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e34d5000 session 0x5616e34eba40
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:06.695400+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325281 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:07.695545+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:08.695687+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:09.695905+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:10.696080+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121438208 unmapped: 24412160 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:11.696249+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1325281 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 24403968 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9634000/0x0/0x4ffc00000, data 0x1b6dac2/0x1c28000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:12.696387+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e327f400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e327f400 session 0x5616e312c780
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e5800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e226c400
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 24395776 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:13.696640+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:14.696778+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:15.696983+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:16.697180+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337509 data_alloc: 234881024 data_used: 11628544
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:17.697303+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:18.697574+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:19.697725+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:20.697882+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:21.698034+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337509 data_alloc: 234881024 data_used: 11628544
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:22.698207+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:23.698368+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.755592346s of 17.786962509s, submitted: 13
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f9610000/0x0/0x4ffc00000, data 0x1b91ac2/0x1c4c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,0,0,1])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:24.698505+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 20889600 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:25.698668+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:26.698850+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1441767 data_alloc: 234881024 data_used: 13156352
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a1a000/0x0/0x4ffc00000, data 0x2786ac2/0x2841000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:27.698986+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125386752 unmapped: 20463616 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:28.699139+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:29.699305+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:30.699499+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 125476864 unmapped: 20373504 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:31.699717+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1434471 data_alloc: 234881024 data_used: 13156352
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f8a18000/0x0/0x4ffc00000, data 0x2789ac2/0x2844000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:32.699889+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:33.700055+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:34.700221+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 123789312 unmapped: 22061056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5800 session 0x5616e312c3c0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e226c400 session 0x5616e312cd20
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: handle_auth_request added challenge on 0x5616e21e5800
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.164107323s of 11.421627998s, submitted: 102
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:35.700356+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 23257088 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 ms_handle_reset con 0x5616e21e5800 session 0x5616e359a5a0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:36.700524+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:37.700724+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:38.700880+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:39.701074+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:40.701262+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:41.701385+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:42.701514+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:43.701685+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:44.701896+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:45.702081+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:46.702238+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:47.702354+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:48.702481+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:49.702666+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:50.702809+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:51.703007+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:52.703187+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:53.703392+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:54.703576+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:55.703725+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:56.703910+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 23248896 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:57.704083+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:58.704268+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:12:59.704420+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:00.704537+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:01.704705+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:02.704825+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:03.704998+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:04.705213+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:05.705411+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:06.705586+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:07.705767+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 24010752 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:08.705937+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:09.706065+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:10.706191+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:11.706376+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:12.706541+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:13.706763+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:14.706957+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:15.707119+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:16.707271+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:17.707437+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 24002560 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:18.707640+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:19.707802+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:20.707947+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:21.708058+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:22.708215+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:23.708431+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:24.708635+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:25.708788+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:26.708906+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:27.709064+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:28.709213+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:29.709372+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:30.709470+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:31.709689+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:32.709899+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:33.710080+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:34.710243+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:35.710400+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:36.710568+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:37.710700+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:38.710846+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:39.710979+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:40.711102+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:41.711256+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:42.711421+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:43.711605+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:44.711792+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:45.711899+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:46.712111+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121864192 unmapped: 23986176 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:47.712250+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:48.712363+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:49.712592+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:50.712716+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121872384 unmapped: 23977984 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:51.712858+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'config show' '{prefix=config show}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:52.713120+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 24305664 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:53.713311+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 24428544 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'log dump' '{prefix=log dump}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:54.713631+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'perf dump' '{prefix=perf dump}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'perf schema' '{prefix=perf schema}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 24182784 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:55.713768+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:56.713923+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:57.714042+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:58.714203+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:13:59.714421+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:00.714591+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:01.714721+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:02.714958+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:03.715190+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:04.715294+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:05.715401+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:06.715538+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:07.715654+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:08.715767+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:09.715901+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:10.716023+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:11.716179+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:12.716340+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:13.716639+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:14.716786+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:15.716924+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:16.717040+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:17.717180+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:18.717303+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:19.717439+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:20.717596+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:21.717768+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:22.717957+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:23.718356+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:24.718888+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:25.719099+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:26.719391+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:27.719595+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:28.720631+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:29.720763+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:30.721556+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:31.721706+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:32.722019+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:33.722255+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:34.722745+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:35.722906+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:36.723185+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:37.723338+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:38.723706+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:39.723853+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:40.724180+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:41.724298+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:42.724595+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:43.724878+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:44.725212+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:45.725466+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:46.725613+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:47.725857+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:48.726203+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:49.726486+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:50.726797+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:51.727008+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:52.727215+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:53.727482+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:54.727774+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:55.728365+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:56.729700+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:57.730604+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:58.731615+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:14:59.732107+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:00.732848+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:01.733513+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:02.734040+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:03.734486+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:04.734980+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:05.735420+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:06.735826+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:07.736135+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:08.736507+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:09.736790+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:10.737053+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:11.737209+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:12.737563+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:13.737779+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:14.737989+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:15.738212+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:16.738476+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:17.738726+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:18.739055+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:19.739262+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:20.739452+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:21.739623+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:22.739870+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:23.740099+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:24.740358+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:25.740510+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:26.740803+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:27.740971+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:28.741094+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:29.741228+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:30.741340+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:31.741503+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:32.741641+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:33.741811+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:34.741968+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:35.742092+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:36.742242+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:37.742406+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:38.742547+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:39.742687+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:40.742887+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:41.743093+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:42.743300+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:43.743455+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:44.743630+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:45.743859+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:46.744000+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:47.744233+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:48.744507+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:49.744647+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:50.744791+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:51.744910+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:52.745085+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:53.745357+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:54.745491+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:55.745677+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:56.745857+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:57.746042+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:58.746236+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:15:59.746454+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:00.746614+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:01.748862+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:02.749648+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:03.749834+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:04.750639+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 24174592 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:05.750941+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:06.751852+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:07.752544+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:08.752757+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:09.753481+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:10.753898+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:11.754077+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:12.754366+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:13.754537+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:14.754758+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:15.754931+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:16.755116+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:17.755503+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:18.755753+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:19.755920+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:20.756213+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:21.756516+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:22.756777+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:23.757085+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:24.757465+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:25.757692+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:26.758071+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:27.758226+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:28.758577+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:29.758837+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:30.758987+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:31.759226+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:32.759429+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 24158208 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:33.759607+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:34.759774+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:35.759945+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:36.760094+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:37.760256+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:38.760418+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:39.760617+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:40.760772+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:41.761236+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:42.761482+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121700352 unmapped: 24150016 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:43.761661+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:44.761843+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:45.762026+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:46.762208+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:47.762335+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:48.762482+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:49.762633+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:50.762747+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:51.762902+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:52.763038+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:53.763220+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:54.763362+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:55.763553+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:56.763714+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:57.763940+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 234881024 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:58.764192+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:16:59.764362+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:00.764557+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:01.764749+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:02.764939+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:03.765179+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:04.765343+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:05.765587+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:06.766090+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:07.766447+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:08.766624+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:09.766962+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:10.767510+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:11.767973+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:12.768185+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:13.771401+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:14.771613+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:15.771802+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:16.772026+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:17.772276+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:18.772790+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:19.773127+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:20.773335+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:21.773534+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:22.773798+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:23.774076+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:24.774896+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:25.775269+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:26.776011+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:27.776347+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:28.776974+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:29.777566+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:30.777872+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:31.778087+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:32.778255+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:33.778440+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:34.778764+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:35.779048+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:36.779361+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:37.779499+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:38.779647+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:39.779799+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:40.779937+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:41.780090+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:42.780232+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:43.780464+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:44.780654+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:45.780816+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:46.780950+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:47.781119+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:48.781271+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 24109056 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:49.781447+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:50.781641+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:51.781803+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:52.781995+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:53.782214+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:54.782354+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:55.782557+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:56.782740+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:57.782907+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:58.783057+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:17:59.783242+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:00.783451+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:01.783639+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:02.783827+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121749504 unmapped: 24100864 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:03.784013+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:04.784226+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:05.784438+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:06.784596+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:07.784806+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:08.784984+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:09.785134+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:10.785320+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 24092672 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:11.785492+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:12.785667+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:13.785866+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:14.786019+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:15.786136+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:16.786351+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:17.786535+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:18.786694+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:19.786860+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:20.787060+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 24084480 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:21.787249+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:22.787426+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:23.787632+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:24.787799+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:25.787982+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:26.788138+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:27.788338+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:28.788467+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:29.788650+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:30.788802+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:31.788994+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:32.789224+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:33.789447+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:34.789643+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 24076288 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:35.789785+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:36.789957+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:37.790118+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:38.790283+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 12K writes, 47K keys, 12K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 12K writes, 3443 syncs, 3.64 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1534 writes, 5165 keys, 1534 commit groups, 1.0 writes per commit group, ingest: 4.83 MB, 0.01 MB/s
                                           Interval WAL: 1534 writes, 631 syncs, 2.43 writes per sync, written: 0.00 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:39.790442+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:40.790630+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:41.790942+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:42.791228+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 24068096 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:43.791542+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:44.791751+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:45.792020+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:46.792232+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:47.792481+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:48.792724+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:49.793012+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:50.793199+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:51.793427+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:52.793640+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:53.793882+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:54.794114+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:55.794268+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:56.794380+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:57.794621+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:58.794841+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121790464 unmapped: 24059904 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:18:59.795017+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:00.795206+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:01.795390+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:02.795595+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:03.795839+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:04.796037+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:05.796273+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:06.796493+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121798656 unmapped: 24051712 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:07.796691+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:08.796854+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:09.797088+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:10.797376+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:11.797599+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:12.797758+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:13.797959+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:14.798193+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:15.798377+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:16.798586+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:17.798735+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:18.798881+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:19.799067+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:20.799242+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:21.799394+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:22.799522+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 24043520 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:23.799799+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:24.799955+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:25.800095+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:26.800301+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:27.800475+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:28.800658+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:29.800841+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:30.801005+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121815040 unmapped: 24035328 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:31.801208+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:32.801399+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:33.801583+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:34.801747+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:35.801878+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:36.802050+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:37.802247+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:38.802448+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:39.802599+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:40.802747+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:41.802863+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:42.803023+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:43.803188+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121823232 unmapped: 24027136 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:44.803341+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:45.803563+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:46.803765+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:47.803925+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:48.804054+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:49.804231+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:50.804583+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:51.804863+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:52.805043+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:53.805264+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121831424 unmapped: 24018944 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:54.805466+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 439.931579590s of 439.972412109s, submitted: 10
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,0,0,0,2])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:55.805603+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:56.805751+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:57.805919+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121946112 unmapped: 23904256 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [0,0,1,1])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:58.806100+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:19:59.806306+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:00.806489+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:01.806658+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:02.806855+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:03.807053+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:04.807219+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:05.807439+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:06.807610+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:07.807756+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:08.807870+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:09.808038+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:10.808204+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:11.808352+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:12.808547+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:13.808772+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:14.808969+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:15.809130+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:16.809499+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:17.809734+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:18.809923+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122077184 unmapped: 23773184 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:19.810075+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:20.810286+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:21.810457+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:22.810702+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:23.810925+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:24.811199+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:25.811331+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:26.811554+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122085376 unmapped: 23764992 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:27.811776+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:28.811931+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:29.812079+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:30.812230+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:31.812370+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:32.812637+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:33.813001+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:34.813210+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122093568 unmapped: 23756800 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:35.813384+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:36.813597+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:37.813774+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:38.814004+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:39.814232+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:40.814407+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:41.814595+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:42.814773+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122101760 unmapped: 23748608 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:43.814976+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:44.815250+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:45.815413+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:46.815608+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:47.815886+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:48.816115+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:49.816293+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:50.816461+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:51.816655+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:52.816851+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:53.817088+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:54.817263+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:55.817450+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:56.817595+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:57.817715+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:58.817833+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122109952 unmapped: 23740416 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:20:59.817968+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:00.818103+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:01.818220+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:02.818345+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:03.818538+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:04.818672+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:05.818815+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:06.818972+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:07.819106+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:08.819272+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:09.819405+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:10.819611+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:11.819743+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:12.819879+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:13.820087+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:14.820301+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:15.820417+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:16.820587+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:17.820726+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:18.820879+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:19.821007+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:20.821223+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:21.821350+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:22.821516+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:23.821727+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:24.871496+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:25.871778+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:26.872474+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:27.872727+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:28.873621+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:29.873852+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:30.874052+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:31.874442+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:32.874942+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:33.875221+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:34.875431+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:35.875604+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:36.875769+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:37.876013+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:38.876270+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:39.876508+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:40.876671+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:41.876899+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:42.877216+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:43.877398+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:44.878299+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:45.878519+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:46.878766+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:47.879021+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:48.879295+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:49.879507+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:50.879692+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:51.879876+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:52.880425+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:53.880631+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:54.880777+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:55.880913+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:56.881087+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:57.881246+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:58.881398+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:21:59.881604+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:00.881767+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:01.881913+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:02.883557+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122118144 unmapped: 23732224 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:03.883717+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:04.883844+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:05.884022+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:06.884179+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:07.884354+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:08.884509+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:09.884657+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:10.884800+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 23724032 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:11.885010+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:12.885202+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:13.885423+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:14.885555+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:15.885696+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:16.885828+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:17.885992+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:18.886167+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122134528 unmapped: 23715840 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:19.886274+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:20.886436+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:21.886674+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:22.886832+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:23.887041+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:24.887199+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:25.887426+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:26.888467+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:27.889417+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:28.889557+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:29.890311+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:30.890442+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:31.890601+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122142720 unmapped: 23707648 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:32.890790+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:33.890980+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:34.891124+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:35.891358+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:36.891514+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:37.891702+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:38.891833+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:39.892239+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122150912 unmapped: 23699456 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:40.892419+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:42.198489+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:43.198723+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:44.199012+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:45.199210+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:46.199430+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:47.199627+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:48.199822+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:49.199969+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:50.200238+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:51.200474+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:52.200698+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:53.200880+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:54.201133+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:55.201410+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:56.201706+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:57.201998+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:58.202233+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:22:59.202590+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122159104 unmapped: 23691264 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:00.203294+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:01.204234+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:02.204506+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:03.204723+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:04.205134+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:05.205618+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:06.205783+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:07.206241+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122167296 unmapped: 23683072 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:08.206526+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:09.206753+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:10.206948+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:11.207114+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:12.207383+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:13.207679+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:14.207913+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:15.208187+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:16.208419+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:17.208736+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:18.208992+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:19.209255+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:20.209434+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:21.209592+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:22.209734+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:23.209980+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:24.210238+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:25.210411+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:26.210568+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:27.210766+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:28.210953+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:29.211120+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:30.211590+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122175488 unmapped: 23674880 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:31.212007+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 122183680 unmapped: 23666688 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:32.212286+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:33.212466+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:34.212649+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:35.212840+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:36.213010+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:37.213245+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:38.213419+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121708544 unmapped: 24141824 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets getting new tickets!
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:39.213662+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _finish_auth 0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:39.215016+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:40.213832+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:41.214055+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:42.214274+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:43.214499+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:44.214760+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:45.214924+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:46.215100+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 24133632 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:47.215310+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:48.215499+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:49.215688+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:50.215823+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:51.215983+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:52.216202+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:53.216353+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:54.216620+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:55.216830+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:56.216977+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:57.217192+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121724928 unmapped: 24125440 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:58.217389+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:23:59.217614+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:00.217791+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:01.217937+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:02.218095+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:03.218274+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:04.218444+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:05.218563+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:06.218670+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:07.218808+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121733120 unmapped: 24117248 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:08.218945+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: osd.0 147 heartbeat osd_stat(store_statfs(0x4f989f000/0x0/0x4ffc00000, data 0x1902ac2/0x19bd000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [1,2] op hist [])
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'config show' '{prefix=config show}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121856000 unmapped: 23994368 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}'
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:09.219081+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Feb 02 10:24:41 compute-1 ceph-osd[77691]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Feb 02 10:24:41 compute-1 ceph-osd[77691]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307786 data_alloc: 218103808 data_used: 10301440
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 24207360 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: tick
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_tickets
Feb 02 10:24:41 compute-1 ceph-osd[77691]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-02T10:24:10.219216+0000)
Feb 02 10:24:41 compute-1 ceph-osd[77691]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 24166400 heap: 145850368 old mem: 2845415833 new mem: 2845415833
Feb 02 10:24:41 compute-1 ceph-osd[77691]: do_command 'log dump' '{prefix=log dump}'
Feb 02 10:24:41 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:41 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:41 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:41.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:41 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 02 10:24:41 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2115974441' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:24:41 compute-1 rsyslogd[1009]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 02 10:24:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 02 10:24:42 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2759989341' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.28712 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1580771436' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.28615 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.19008 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.28733 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/640030923' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2271174169' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.28630 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.19023 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2115974441' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.28757 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/860680990' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:24:42 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2951791200' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Feb 02 10:24:42 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:42 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:42 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:42.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:42 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 02 10:24:42 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1968831604' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:24:42 compute-1 crontab[249925]: (root) LIST (root)
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.044799) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883044858, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1393, "num_deletes": 258, "total_data_size": 3180907, "memory_usage": 3227664, "flush_reason": "Manual Compaction"}
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883064766, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2070865, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39411, "largest_seqno": 40799, "table_properties": {"data_size": 2064676, "index_size": 3328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14724, "raw_average_key_size": 20, "raw_value_size": 2051611, "raw_average_value_size": 2869, "num_data_blocks": 145, "num_entries": 715, "num_filter_entries": 715, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770027779, "oldest_key_time": 1770027779, "file_creation_time": 1770027883, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 20017 microseconds, and 3350 cpu microseconds.
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.064819) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2070865 bytes OK
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.064842) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067221) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067241) EVENT_LOG_v1 {"time_micros": 1770027883067235, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067260) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3173994, prev total WAL file size 3173994, number of live WAL files 2.
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067940) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303031' seq:72057594037927935, type:22 .. '6C6F676D0031323535' seq:0, type:0; will stop at (end)
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2022KB)], [75(11MB)]
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883067972, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 14272057, "oldest_snapshot_seqno": -1}
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6908 keys, 14141304 bytes, temperature: kUnknown
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883186699, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 14141304, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14096790, "index_size": 26162, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17285, "raw_key_size": 182070, "raw_average_key_size": 26, "raw_value_size": 13973765, "raw_average_value_size": 2022, "num_data_blocks": 1027, "num_entries": 6908, "num_filter_entries": 6908, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1770025175, "oldest_key_time": 0, "file_creation_time": 1770027883, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fac1d709-8a2a-487d-b05b-57255ec289c7", "db_session_id": "DE871D21TSCUFP8UED8E", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.186898) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 14141304 bytes
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.226016) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 120.1 rd, 119.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.6 +0.0 blob) out(13.5 +0.0 blob), read-write-amplify(13.7) write-amplify(6.8) OK, records in: 7438, records dropped: 530 output_compression: NoCompression
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.226060) EVENT_LOG_v1 {"time_micros": 1770027883226044, "job": 46, "event": "compaction_finished", "compaction_time_micros": 118788, "compaction_time_cpu_micros": 20554, "output_level": 6, "num_output_files": 1, "total_output_size": 14141304, "num_input_records": 7438, "num_output_records": 6908, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883226401, "job": 46, "event": "table_file_deletion", "file_number": 77}
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: EVENT_LOG_v1 {"time_micros": 1770027883227566, "job": 46, "event": "table_file_deletion", "file_number": 75}
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.067871) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227632) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:24:43 compute-1 ceph-mon[80115]: rocksdb: (Original Log Time 2026/02/02-10:24:43.227636) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 02 10:24:43 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Feb 02 10:24:43 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4167825202' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.28654 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.19038 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2759989341' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.28784 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3128593866' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2856937313' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.28675 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.19065 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: pgmap v1364: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.28805 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1968831604' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:24:43 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3763930355' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Feb 02 10:24:43 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:43 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:43 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:43.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:43 compute-1 sudo[250059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Feb 02 10:24:43 compute-1 sudo[250059]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Feb 02 10:24:43 compute-1 sudo[250059]: pam_unix(sudo:session): session closed for user root
Feb 02 10:24:43 compute-1 nova_compute[226294]: 2026-02-02 10:24:43.716 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:44 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Feb 02 10:24:44 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3391420026' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.28696 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.19083 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.28817 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/4167825202' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2540107102' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.28717 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.19098 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.28829 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3471163310' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.19122 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.28726 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3078264917' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.28850 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1120965683' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3391420026' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Feb 02 10:24:44 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/774094201' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb 02 10:24:44 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:44 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:44 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:44.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:44 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Feb 02 10:24:44 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3346937992' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb 02 10:24:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:24:44.926 143542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 02 10:24:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:24:44.927 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 02 10:24:44 compute-1 ovn_metadata_agent[143537]: 2026-02-02 10:24:44.927 143542 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 02 10:24:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Feb 02 10:24:45 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/117477717' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Feb 02 10:24:45 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/236570362' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Feb 02 10:24:45 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3903962935' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb 02 10:24:45 compute-1 systemd[1]: Starting Hostname Service...
Feb 02 10:24:45 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:45 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:45 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:45.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.19131 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.28868 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3892154292' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: pgmap v1365: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3346937992' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3909106104' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.28883 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.19149 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2995767945' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/695109794' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/117477717' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1184006819' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/236570362' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3979996278' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3903962935' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/700439238' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 systemd[1]: Started Hostname Service.
Feb 02 10:24:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Feb 02 10:24:45 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3038498722' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb 02 10:24:45 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Feb 02 10:24:45 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3318002558' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Feb 02 10:24:46 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/951116633' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Feb 02 10:24:46 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1216490102' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb 02 10:24:46 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:46 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:46 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:46.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.19173 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/141693057' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.19191 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3038498722' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/860452697' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3318002558' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1693270302' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2213294818' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/422127175' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1823691420' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/951116633' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4278616183' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1216490102' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3857455932' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Feb 02 10:24:46 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1937781476' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb 02 10:24:46 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Feb 02 10:24:46 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/812017344' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 02 10:24:47 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1828341276' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Feb 02 10:24:47 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1322837347' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb 02 10:24:47 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:47 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:47 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:47.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Feb 02 10:24:47 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2100496108' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/3677594601' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: pgmap v1366: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1937781476' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4056548866' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2363084501' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/812017344' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2340112825' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3949875084' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1828341276' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2357756114' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='mgr.14760 192.168.122.100:0/1432667282' entity='mgr.compute-0.djvyfo' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1322837347' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4250121613' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2100496108' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3324932441' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Feb 02 10:24:47 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3084816001' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb 02 10:24:47 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:48 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Feb 02 10:24:48 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/657262728' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb 02 10:24:48 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:48 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:48 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:48.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:48 compute-1 nova_compute[226294]: 2026-02-02 10:24:48.718 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.28882 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.28924 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.28912 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3084816001' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2242730504' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.28939 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/185207194' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/657262728' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1265805961' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1617644932' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Feb 02 10:24:48 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1841370334' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Feb 02 10:24:49 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2380801877' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Feb 02 10:24:49 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:49 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:49 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:49.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.19287 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.28963 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.29060 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.29069 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: pgmap v1367: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.28987 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3039444262' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2205478616' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2383295343' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1021502049' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:24:49 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2380801877' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Feb 02 10:24:50 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1466464595' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Feb 02 10:24:50 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2157195115' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:50 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:50 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000027s ======
Feb 02 10:24:50 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:50.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Feb 02 10:24:50 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.29090 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.19335 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.29008 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.29108 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.19359 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.29026 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.19368 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2473725231' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.29126 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.29041 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1466464595' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2157195115' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3883505779' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:50 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:50 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Feb 02 10:24:50 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3165818654' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb 02 10:24:51 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:51 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.001000026s ======
Feb 02 10:24:51 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:51.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Feb 02 10:24:51 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:51 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='client.19380 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='client.29132 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='client.19407 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: pgmap v1368: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='client.29159 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3165818654' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='client.19431 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/1635200105' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1799157703' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2294472089' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:51 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Feb 02 10:24:52 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1211669878' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Feb 02 10:24:52 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:52 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:52 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:52.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 02 10:24:52 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='client.29192 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='client.19458 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='client.29131 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='client.19476 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/1543839551' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/3757286924' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1211669878' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/473355055' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:52 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:52 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Feb 02 10:24:52 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1628641530' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:53 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Feb 02 10:24:53 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1751019335' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Feb 02 10:24:53 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:53 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:53 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.100 - anonymous [02/Feb/2026:10:24:53.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:53 compute-1 nova_compute[226294]: 2026-02-02 10:24:53.719 226298 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 02 10:24:53 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Feb 02 10:24:53 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2735528338' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='client.19497 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='client.29267 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: pgmap v1369: 353 pgs: 353 active+clean; 41 MiB data, 303 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/2972204838' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1628641530' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/510477597' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2725893371' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Feb 02 10:24:53 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/1751019335' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Feb 02 10:24:54 compute-1 ceph-mon[80115]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Feb 02 10:24:54 compute-1 ceph-mon[80115]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3685415627' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Feb 02 10:24:54 compute-1 radosgw[81528]: ====== starting new request req=0x7fc4a354c5d0 =====
Feb 02 10:24:54 compute-1 radosgw[81528]: ====== req done req=0x7fc4a354c5d0 op status=0 http_status=200 latency=0.000000000s ======
Feb 02 10:24:54 compute-1 radosgw[81528]: beast: 0x7fc4a354c5d0: 192.168.122.102 - anonymous [02/Feb/2026:10:24:54.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Feb 02 10:24:55 compute-1 ceph-mon[80115]: from='client.29221 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:55 compute-1 ceph-mon[80115]: from='client.19566 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 02 10:24:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/2735528338' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Feb 02 10:24:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/833716387' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Feb 02 10:24:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/2757971077' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Feb 02 10:24:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.101:0/3685415627' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Feb 02 10:24:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.102:0/4181213782' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Feb 02 10:24:55 compute-1 ceph-mon[80115]: from='client.? 192.168.122.100:0/4094173751' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
